Part A: [5 pts] Load and preprocess the data using Pandas and remove the unneeded attributes. For the purposes of this assignment you do not need to normalize or standardize the data unless explicitly required in one of the following tasks. However, you may need to handle missing values by imputing those values based on variable means. Compute and display basic statistics (mean, standard deviation, min, max, etc.) for the variables in the data set. Separate the target attribute for regression. Use scikit-learn's train_test_split function to create a 20%-80% randomized split of the data (important note: for reproducible output across multiple runs, please use "random_state = 33"). Set aside the 20% test portion; the 80% training partition will be used for cross-validation in the various tasks specified below.¶
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn import preprocessing
from sklearn.linear_model import LinearRegression, Lasso, Ridge, ElasticNet, SGDRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error
from sklearn.model_selection import KFold, cross_val_score
from sklearn.feature_selection import SelectPercentile, f_regression
from sklearn.feature_extraction.text import TfidfTransformer
from sklearn.cluster import KMeans
from wordcloud import WordCloud
from sklearn.metrics import completeness_score, homogeneity_score
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.linear_model import SGDRegressor
from sklearn.model_selection import GridSearchCV
from sklearn.preprocessing import StandardScaler
from sklearn.metrics.pairwise import cosine_similarity
import warnings
from sklearn.metrics import completeness_score, homogeneity_score
# Ignore specific warnings
warnings.filterwarnings("ignore", category=FutureWarning, module="sklearn.cluster._kmeans")
warnings.filterwarnings("ignore", category=UserWarning, module="sklearn.cluster._kmeans")
warnings.filterwarnings("ignore", category=UserWarning, module="sklearn.metrics.cluster._supervised")
# Read the data
crime_data = pd.read_csv('N:/Programming machine learning/assignment3/communities/communities.csv', sep=',', na_values=['?'])
crime_data.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 1994 entries, 0 to 1993
Data columns (total 100 columns): 'state' (int64), 'communityname' (object), and 98 float64 attributes from 'population' through 'ViolentCrimesPerPop'. All columns are 1994 non-null except 'OtherPerCap' (1993 non-null).
dtypes: float64(98), int64(1), object(1)
memory usage: 1.5+ MB
# Check the shape of the dataset
print("Shape of the dataset:", crime_data.shape)
Shape of the dataset: (1994, 100)
Impute missing values with the mean of the respective column¶
#Handling missing values by imputing those values based on the variable means for numeric columns
numeric_cols = crime_data.select_dtypes(include=[np.number]).columns
crime_data[numeric_cols] = crime_data[numeric_cols].fillna(crime_data[numeric_cols].mean())
Missing values¶
# Verify that there are no more missing values
missing_values = crime_data.isna().sum()
print("Missing values after imputation:", missing_values[missing_values > 0])
Missing values after imputation: Series([], dtype: int64)
Compute and display basic statistics for the dataset¶
# Compute and display basic statistics for the variables in the data set
print("Basic statistics for the variables:")
crime_data.describe().T
Basic statistics for the variables:
| | count | mean | std | min | 25% | 50% | 75% | max |
|---|---|---|---|---|---|---|---|---|
| state | 1994.0 | 28.683551 | 16.397553 | 1.0 | 12.00 | 34.00 | 42.00 | 56.0 |
| population | 1994.0 | 0.057593 | 0.126906 | 0.0 | 0.01 | 0.02 | 0.05 | 1.0 |
| householdsize | 1994.0 | 0.463395 | 0.163717 | 0.0 | 0.35 | 0.44 | 0.54 | 1.0 |
| racepctblack | 1994.0 | 0.179629 | 0.253442 | 0.0 | 0.02 | 0.06 | 0.23 | 1.0 |
| racePctWhite | 1994.0 | 0.753716 | 0.244039 | 0.0 | 0.63 | 0.85 | 0.94 | 1.0 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... |
| PctSameState85 | 1994.0 | 0.651530 | 0.198221 | 0.0 | 0.56 | 0.70 | 0.79 | 1.0 |
| LandArea | 1994.0 | 0.065231 | 0.109459 | 0.0 | 0.02 | 0.04 | 0.07 | 1.0 |
| PopDens | 1994.0 | 0.232854 | 0.203092 | 0.0 | 0.10 | 0.17 | 0.28 | 1.0 |
| PctUsePubTrans | 1994.0 | 0.161685 | 0.229055 | 0.0 | 0.02 | 0.07 | 0.19 | 1.0 |
| ViolentCrimesPerPop | 1994.0 | 0.237979 | 0.232985 | 0.0 | 0.07 | 0.15 | 0.33 | 1.0 |
99 rows × 8 columns
Separate the target variable 'ViolentCrimesPerPop' from the features¶
#Separate the target attribute for regression from the rest of the attributes of the data set
vs_target = crime_data['ViolentCrimesPerPop']
crime_data.drop(columns=['ViolentCrimesPerPop', 'communityname', 'state'], inplace=True)
# Convert the original data set and the response variable into numpy arrays
X = crime_data.to_numpy()
y = vs_target.to_numpy()
print("Shape of X:", X.shape)
print("Shape of y:", y.shape)
Shape of X: (1994, 97)
Shape of y: (1994,)
#Extract the feature names
feature_names = crime_data.columns.values
print("Feature names:", feature_names)
Feature names: ['population' 'householdsize' 'racepctblack' 'racePctWhite' 'racePctAsian' 'racePctHisp' 'agePct12t21' 'agePct12t29' 'agePct16t24' 'agePct65up' 'numbUrban' 'pctUrban' 'medIncome' 'pctWWage' 'pctWFarmSelf' 'pctWInvInc' 'pctWSocSec' 'pctWPubAsst' 'pctWRetire' 'medFamInc' 'perCapInc' 'whitePerCap' 'blackPerCap' 'indianPerCap' 'AsianPerCap' 'OtherPerCap' 'HispPerCap' 'NumUnderPov' 'PctPopUnderPov' 'PctLess9thGrade' 'PctNotHSGrad' 'PctBSorMore' 'PctUnemployed' 'PctEmploy' 'PctEmplManu' 'PctEmplProfServ' 'MalePctDivorce' 'MalePctNevMarr' 'FemalePctDiv' 'TotalPctDiv' 'PersPerFam' 'PctFam2Par' 'PctKids2Par' 'PctYoungKids2Par' 'PctTeen2Par' 'PctWorkMomYoungKids' 'PctWorkMom' 'NumIlleg' 'PctIlleg' 'NumImmig' 'PctImmigRecent' 'PctImmigRec5' 'PctImmigRec8' 'PctImmigRec10' 'PctRecentImmig' 'PctRecImmig5' 'PctRecImmig8' 'PctRecImmig10' 'PctSpeakEnglOnly' 'PctNotSpeakEnglWell' 'PctLargHouseFam' 'PctLargHouseOccup' 'PersPerOccupHous' 'PersPerOwnOccHous' 'PersPerRentOccHous' 'PctPersOwnOccup' 'PctPersDenseHous' 'PctHousLess3BR' 'MedNumBR' 'HousVacant' 'PctHousOccup' 'PctHousOwnOcc' 'PctVacantBoarded' 'PctVacMore6Mos' 'MedYrHousBuilt' 'PctHousNoPhone' 'PctWOFullPlumb' 'OwnOccLowQuart' 'OwnOccMedVal' 'OwnOccHiQuart' 'RentLowQ' 'RentMedian' 'RentHighQ' 'MedRent' 'MedRentPctHousInc' 'MedOwnCostPctInc' 'MedOwnCostPctIncNoMtg' 'NumInShelters' 'NumStreet' 'PctForeignBorn' 'PctBornSameState' 'PctSameHouse85' 'PctSameCity85' 'PctSameState85' 'LandArea' 'PopDens' 'PctUsePubTrans']
# Ensure all arrays are of the same length
if X.shape[0] == y.shape[0]:
    print("All arrays are of the same length.")
else:
    print("Arrays are not of the same length.")
All arrays are of the same length.
Split the data into training and testing sets (80% training, 20% testing)¶
# Use scikit-learn's train_test_split function to create a 20%-80% randomized split of the data
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=33)
print("Shape of X_train:", X_train.shape)
print("Shape of X_test:", X_test.shape)
print("Shape of y_train:", y_train.shape)
print("Shape of y_test:", y_test.shape)
Shape of X_train: (1595, 97)
Shape of X_test: (399, 97)
Shape of y_train: (1595,)
Shape of y_test: (399,)
X_train_df = pd.DataFrame(X_train, columns=feature_names)
X_train_df.head()
| | population | householdsize | racepctblack | racePctWhite | racePctAsian | racePctHisp | agePct12t21 | agePct12t29 | agePct16t24 | agePct65up | ... | NumInShelters | NumStreet | PctForeignBorn | PctBornSameState | PctSameHouse85 | PctSameCity85 | PctSameState85 | LandArea | PopDens | PctUsePubTrans |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0.01 | 0.54 | 0.02 | 0.91 | 0.27 | 0.04 | 0.37 | 0.41 | 0.25 | 0.28 | ... | 0.01 | 0.00 | 0.29 | 0.42 | 0.72 | 0.74 | 0.62 | 0.01 | 0.20 | 0.47 |
| 1 | 0.99 | 0.42 | 0.59 | 0.44 | 0.11 | 0.11 | 0.44 | 0.56 | 0.37 | 0.39 | ... | 0.30 | 0.12 | 0.14 | 0.71 | 0.52 | 0.79 | 0.75 | 0.28 | 0.55 | 0.62 |
| 2 | 0.01 | 0.53 | 0.02 | 0.95 | 0.15 | 0.03 | 0.27 | 0.37 | 0.18 | 0.23 | ... | 0.01 | 0.00 | 0.17 | 0.54 | 0.38 | 0.24 | 0.51 | 0.11 | 0.03 | 0.09 |
| 3 | 0.07 | 0.41 | 0.02 | 0.97 | 0.05 | 0.02 | 0.32 | 0.49 | 0.31 | 0.44 | ... | 0.00 | 0.00 | 0.13 | 0.88 | 0.76 | 0.74 | 0.84 | 0.05 | 0.27 | 0.40 |
| 4 | 0.05 | 0.41 | 0.11 | 0.85 | 0.23 | 0.03 | 0.32 | 0.52 | 0.26 | 0.10 | ... | 0.00 | 0.00 | 0.17 | 0.33 | 0.20 | 0.23 | 0.05 | 0.09 | 0.12 | 0.01 |
5 rows × 97 columns
PART B:[10 pts] Perform standard multiple linear regression on data using the scikit-learn Linear Regression module. Compute the RMSE values on the full training data (the 80% partition). Also, plot the correlation between the predicted and actual values of the target attribute. Display the obtained regression coefficients (weights) and plot them using matplotlib. Finally, perform 10-fold cross-validation on the training partition and compare the cross-validation RMSE to the training RMSE (for cross validation, you should use the KFold module from sklearn.model_selection).¶
lm = LinearRegression()
lm.fit(X_train, y_train)
LinearRegression()
# Display the regression coefficients
print("Regression coefficients:", lm.coef_)
Regression coefficients: [-7.52656629e-03 -5.28396680e-02 2.21547570e-01 -5.03199162e-02 -1.96010315e-02 4.76949814e-02 1.02285177e-01 -1.69896511e-01 -1.26211473e-01 7.14204649e-02 -1.25853261e-01 3.87986180e-02 -2.01918099e-01 -2.13121910e-01 4.21246697e-02 -7.44445905e-02 1.24852413e-02 2.25201184e-02 -9.58013048e-02 3.53979558e-01 7.57326633e-02 -3.52089855e-01 -2.93676330e-02 -4.15893659e-02 3.33220798e-02 4.71028688e-02 3.89021825e-02 2.97770808e-01 -2.38551315e-01 -8.53275102e-03 -4.53960097e-02 7.67420267e-02 1.56572032e-02 2.25051684e-01 -4.97873716e-02 -3.91366512e-02 3.46194760e-01 2.11727329e-01 7.54492064e-02 -3.59121087e-01 -1.58937159e-01 -1.73716203e-02 -2.60253644e-01 -4.63158082e-02 -9.00374174e-03 8.91193223e-03 -1.57889829e-01 -2.66670112e-01 1.43290159e-01 -1.18798830e-01 8.40395513e-03 4.89371965e-02 -1.88835442e-01 1.36372557e-01 -3.22691120e-02 -2.44000729e-01 6.40813099e-01 -3.81532245e-01 -3.50231164e-04 -1.47935514e-01 5.34217081e-02 -2.16009746e-01 5.72999805e-01 2.33844078e-02 -2.16469370e-01 -7.33035824e-01 2.02860550e-01 1.46593734e-01 3.48978808e-02 1.46645983e-01 -4.14065836e-02 6.51521228e-01 7.32653271e-02 -7.68672044e-02 -2.95905282e-02 5.41657059e-02 -3.44943760e-02 -2.36146882e-01 9.50363700e-02 9.48489471e-02 -2.51834189e-01 -8.41867124e-02 -9.68358486e-02 3.85569530e-01 6.27153658e-02 -3.02791431e-02 -7.75663520e-02 1.38903521e-01 1.26440321e-01 1.69036946e-01 -7.44142382e-03 -3.02836640e-02 4.16837519e-02 1.54057146e-02 1.94982907e-02 9.90036783e-04 -4.23656568e-02]
len(feature_names)
97
# Predict on the training set
y_train_pred = lm.predict(X_train)
# Plot the correlation between the predicted and actual values
plt.figure(figsize=(8, 6))
plt.scatter(y_train, y_train_pred, alpha=0.5)
plt.plot([min(y_train), max(y_train)], [min(y_train), max(y_train)], 'r--')  # identity (y = x) reference line
plt.xlabel('Actual Values')
plt.ylabel('Predicted Values')
plt.title('Actual vs Predicted Values')
plt.show()
def plot_coefficients(model, n_features, feature_names):
    plt.rcParams.update({'font.size': 18})
    coefficients = pd.DataFrame({'Feature': feature_names, 'Coefficient': model.coef_})
    coefficients.sort_values(by='Coefficient', ascending=False, inplace=True)
    print('The Top 5 important features:')
    print(coefficients.head(5))
    print('\nThe last 5 less important features:')
    print(coefficients.tail(5))
    plt.figure(figsize=(12, 25))
    plt.barh(coefficients['Feature'], coefficients['Coefficient'], color='orange')
    plt.xlabel('Coefficient Value')
    plt.ylabel('Features')
    plt.title('Features vs Regression Coefficients')
    plt.gca().invert_yaxis()  # Invert y-axis to have the largest coefficient on top
    plt.show()
plot_coefficients(lm, len(feature_names), feature_names)
The Top 5 important features:
Feature Coefficient
71 PctHousOwnOcc 0.651521
56 PctRecImmig8 0.640813
62 PersPerOccupHous 0.573000
83 MedRent 0.385570
19 medFamInc 0.353980
The last 5 less important features:
Feature Coefficient
47 NumIlleg -0.266670
21 whitePerCap -0.352090
39 TotalPctDiv -0.359121
57 PctRecImmig10 -0.381532
65 PctPersOwnOccup -0.733036
n = 10
kf = KFold(n_splits=n)
kf.get_n_splits(X_train)
xval_err = 0
for train_index, test_index in kf.split(X_train):
    lm.fit(X_train[train_index], y_train[train_index])
    p = lm.predict(X_train[test_index])
    e = p - y_train[test_index]
    xval_err += np.sqrt(np.dot(e, e) / len(X_train[test_index]))
# the average RMSE cross-validation score, obtained by dividing by n
rmse_10cv = xval_err / n
# Predict on the training set
y_train_pred = lm.predict(X_train)
# Calculate the RMSE on the training set
rmse_train = np.sqrt(mean_squared_error(y_train, y_train_pred))
# Print the results
method_name = 'Simple Linear Regression'
print('Method:', method_name)
print('RMSE on training: %.4f' % rmse_train)
print('RMSE on 10-fold CV: %.4f' % rmse_10cv)
Method: Simple Linear Regression
RMSE on training: 0.1267
RMSE on 10-fold CV: 0.1343
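As a cross-check on the manual KFold loop above, scikit-learn's cross_val_score can compute the same kind of fold-averaged RMSE directly via scoring='neg_root_mean_squared_error'. A minimal sketch on synthetic data (X_demo and y_demo are illustrative stand-ins, not the notebook's arrays):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

# Synthetic stand-in data (illustrative only)
rng = np.random.default_rng(33)
X_demo = rng.random((200, 5))
y_demo = X_demo @ rng.random(5) + 0.1 * rng.standard_normal(200)

# cross_val_score negates RMSE, so flip the sign to get positive values
kf = KFold(n_splits=10)
scores = -cross_val_score(LinearRegression(), X_demo, y_demo,
                          cv=kf, scoring='neg_root_mean_squared_error')
rmse_10cv = scores.mean()
print(f'RMSE on 10-fold CV: {rmse_10cv:.4f}')
```

Note that this averages per-fold RMSE values, the same convention used in the loop above.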
PART C. [15 pts] Feature Selection: use scikit-learn to select the best subset of features to perform linear regression. For feature selection, write a script or function that takes as input the training data; the target variable; the regression model; and any other parameters you find necessary. The function should return the optimal percentage of the most informative features to use. Your approach should use k-fold cross-validation on the training data (use k=5 for consistency) and use feature_selection.SelectPercentile to find the most informative variables across a range of percentile values [Note: since this is regression, not classification, in the SelectPercentile function you should use feature_selection.f_regression as the scoring function rather than chi2]. You should also plot the model's error values on cross-validation using only the selected features across the range of percentile values. For variety, in this part we will use Mean Absolute Error (MAE) as the error metric instead of RMSE. For cross-validation, use scikit-learn's cross_val_score function. In order to use cross_val_score with regression you'll need to pass it a specific error function; in this case, use scoring='neg_mean_absolute_error' as a parameter. You should take absolute values to convert these negated MAE values to positive MAE values. Your plot should look similar to (but won't be exactly the same as) this example. Once you have identified the best percentile based on cross-validation, use it to identify and display the corresponding best features. As a final step, train your model on the full 80% training data with the optimal subset of features and then compute its performance (again using MAE) on the set-aside 20% test partition.¶
[Note: For an example of a similar feature selection process please review the class example notebook (though note that the task in this example was classification not regression). Also, review scikit-learn documentation for feature selection.]
lm = LinearRegression()
lm.fit(X_train, y_train)
LinearRegression()
# Feature selection
percentiles = range(1, 101, 5)
results = []
for i in percentiles:
    fs = SelectPercentile(f_regression, percentile=i)
    X_train_fs = fs.fit_transform(X_train, y_train)
    scores = -cross_val_score(lm, X_train_fs, y_train, cv=5, scoring='neg_mean_absolute_error')
    print(f"Percentile: {i}, Mean Absolute Error: {scores.mean()}")
    results.append(scores.mean())
optimal_percentile = np.argmin(results)  # index into percentiles of the lowest CV MAE
optimal_num_features = int(percentiles[optimal_percentile] * X_train.shape[1] / 100)
print(f"Optimal percentile: {percentiles[optimal_percentile]}")
print(f"Number of features: {optimal_num_features}")
Percentile: 1, Mean Absolute Error: 0.11287401163402822
Percentile: 6, Mean Absolute Error: 0.09943268475677308
Percentile: 11, Mean Absolute Error: 0.09900510519125442
Percentile: 16, Mean Absolute Error: 0.0990905907952681
Percentile: 21, Mean Absolute Error: 0.09750686221310859
Percentile: 26, Mean Absolute Error: 0.09608904536542377
Percentile: 31, Mean Absolute Error: 0.09478480270197638
Percentile: 36, Mean Absolute Error: 0.09474955355002232
Percentile: 41, Mean Absolute Error: 0.09435370428857698
Percentile: 46, Mean Absolute Error: 0.09466441760571027
Percentile: 51, Mean Absolute Error: 0.09498773346781053
Percentile: 56, Mean Absolute Error: 0.0953208516480594
Percentile: 61, Mean Absolute Error: 0.09574966889254498
Percentile: 66, Mean Absolute Error: 0.09572089544915725
Percentile: 71, Mean Absolute Error: 0.09583132229767206
Percentile: 76, Mean Absolute Error: 0.09519157369784723
Percentile: 81, Mean Absolute Error: 0.09542568827536392
Percentile: 86, Mean Absolute Error: 0.09513624310358182
Percentile: 91, Mean Absolute Error: 0.09525366581598069
Percentile: 96, Mean Absolute Error: 0.0952762785288391
Optimal percentile: 41
Number of features: 39
# Plotting the performance
plt.figure(figsize=(8, 6))
plt.plot(percentiles, results)
plt.xlabel('Percentile of features')
plt.ylabel('Mean Absolute Error')
plt.title('Feature Selection')
plt.grid(True)
plt.show()
# Selecting the best features
fs = SelectPercentile(f_regression, percentile=percentiles[optimal_percentile])
X_train_best = fs.fit_transform(X_train, y_train)
X_test_best = fs.transform(X_test)
# Train and evaluate the model with the selected features
lm.fit(X_train_best, y_train)
y_pred = lm.predict(X_test_best)
mae = mean_absolute_error(y_test, y_pred)
print(f"Mean Absolute Error with selected features: {mae}")
Mean Absolute Error with selected features: 0.10127083211050729
np.set_printoptions(suppress=True, precision=2, linewidth=80)
print(fs.get_support())
print(fs.scores_)
[ True False True True False False False False False False True False True
True False True False True False True True False False False False False
False True True True True True True True False False True False True
True False True True True True False False True True False False False
False False False False False False False False True False False False False
True True True True True True True True False False True True False
False False False False False False True False False True True False False
False False False False False False]
[ 256.03 3.49 1074.89 1412.51 1.96 137.54 4.89 37.33 15.88
8.25 247.78 11.06 341.63 161.57 40.34 756.01 22.49 784.57
13.98 366.9 214.99 65.45 126.58 12.32 29.07 33.37 92.37
400.31 578.82 319.22 472.2 173.37 533.91 190.47 2.61 7.4
602.77 161.08 704.77 693.14 24.47 1657.87 1972.8 1342.17 1299.54
1.31 36.5 431.64 1936.12 151.62 40.65 70.23 96.28 134.79
85.2 101.54 106.97 116.9 92.38 144.66 241.23 129.54 4.35
28.4 93.29 630.76 375.65 487.96 242.07 339.97 173.74 478.53
507.64 0.27 18.64 505.86 209.11 71.34 57.98 46.44 104.03
95.72 90.25 95.92 178.68 6.72 4.32 265.85 169.53 61.96
9.71 40.93 9.17 0.8 61.98 142.69 40.12]
selected_support = fs.get_support()
selected_scores = fs.scores_
print("Feature \t Score")
for i in range(len(feature_names)):
    if selected_support[i]:
        print(f"{feature_names[i]} \t {selected_scores[i]:.4f}")
Feature 	 Score
population 	 256.0324
racepctblack 	 1074.8895
racePctWhite 	 1412.5078
numbUrban 	 247.7761
medIncome 	 341.6325
pctWWage 	 161.5708
pctWInvInc 	 756.0150
pctWPubAsst 	 784.5706
medFamInc 	 366.9004
perCapInc 	 214.9932
NumUnderPov 	 400.3079
PctPopUnderPov 	 578.8174
PctLess9thGrade 	 319.2181
PctNotHSGrad 	 472.1962
PctBSorMore 	 173.3668
PctUnemployed 	 533.9132
PctEmploy 	 190.4653
MalePctDivorce 	 602.7749
FemalePctDiv 	 704.7727
TotalPctDiv 	 693.1378
PctFam2Par 	 1657.8650
PctKids2Par 	 1972.8041
PctYoungKids2Par 	 1342.1736
PctTeen2Par 	 1299.5351
NumIlleg 	 431.6356
PctIlleg 	 1936.1209
PctLargHouseFam 	 241.2338
PctPersOwnOccup 	 630.7603
PctPersDenseHous 	 375.6459
PctHousLess3BR 	 487.9617
MedNumBR 	 242.0664
HousVacant 	 339.9686
PctHousOccup 	 173.7394
PctHousOwnOcc 	 478.5304
PctVacantBoarded 	 507.6364
PctHousNoPhone 	 505.8627
PctWOFullPlumb 	 209.1149
MedRentPctHousInc 	 178.6842
NumInShelters 	 265.8459
NumStreet 	 169.5343
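One caveat worth noting: fitting SelectPercentile on the whole training partition before cross-validating lets the selector see the held-out folds. Wrapping selection and regression in a Pipeline refits the selector inside each fold instead. A minimal sketch on synthetic data (names like X_demo are illustrative stand-ins, not the notebook's arrays):

```python
import numpy as np
from sklearn.feature_selection import SelectPercentile, f_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

# Synthetic stand-in data: only the first 5 of 20 features are informative
rng = np.random.default_rng(33)
X_demo = rng.random((300, 20))
y_demo = X_demo[:, :5] @ rng.random(5) + 0.1 * rng.standard_normal(300)

pipe = Pipeline([
    ('select', SelectPercentile(f_regression, percentile=41)),
    ('regress', LinearRegression()),
])
# The selector is re-fit on each fold's training split only, avoiding leakage
mae_cv = -cross_val_score(pipe, X_demo, y_demo, cv=5,
                          scoring='neg_mean_absolute_error').mean()
print(f'Leakage-free CV MAE: {mae_cv:.4f}')
```

On this dataset the difference is likely small, but the pipelined version is the safer habit when the selected percentile is itself tuned by cross-validation.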
Part D[10 pts] Next, using the original train and test data in part (a), perform Ridge Regression and Lasso Regression using the modules from sklearn.linear_model. In each case, perform systematic model selection to identify the optimal alpha hyperparameter (the regularization coefficient). You should create a function that takes as input the training data and target variable; the parameter to vary and a list of its values; the model to be trained; and any other relevant input needed to determine the optimal value for the specified parameter. The model selection process should perform k-fold cross validation (k should be a parameter, but you can select k=5 for this problem). For each model, you should also plot the error values (this time using RMSE as the error metric) on the training and cross-validation splits across the specified values of alpha. Finally, using the best alpha values for each regression model, train the model on the full training data and evaluate it on the set-aside test data. Discuss your observations and conclusions, especially about the impact of alpha on bias-variance trade-off. [Hint: for an example of a similar model optimization process please review the class example notebook.]¶
def calculate_parameters(X, y, model, param_values, param_name, K):
    print('{} Regression:'.format(type(model).__name__))
    print(param_name + ' RMSE_train\t RMSE_cv\n')
    t_rmse = np.array([])
    cv_rmse = np.array([])
    for param_value in param_values:
        model.set_params(**{param_name: param_value})
        model.fit(X, y)
        # Training RMSE
        p_train = model.predict(X)
        rmse_train = np.sqrt(mean_squared_error(y, p_train))
        t_rmse = np.append(t_rmse, [rmse_train])
        # Cross-validation RMSE
        kf = KFold(n_splits=K, shuffle=True, random_state=33)
        xval_err = 0
        for train_index, test_index in kf.split(X):
            model.fit(X[train_index], y[train_index])
            p = model.predict(X[test_index])
            e = p - y[test_index]
            xval_err += np.dot(e, e)
        rmse_cv = np.sqrt(xval_err / len(X))
        cv_rmse = np.append(cv_rmse, [rmse_cv])
        # Print the current parameter value and its corresponding RMSEs
        print('{:.3f}\t {:.4f}\t\t {:.4f}'.format(param_value, rmse_train, rmse_cv))
    # Identify and print the optimal parameter value
    min_err_idx = np.argmin(cv_rmse)
    min_err_alpha = param_values[min_err_idx]
    print('\nThe minimum error using cross validation is: {:.4f} with {} : {:.4f}'.format(
        cv_rmse[min_err_idx], param_name, min_err_alpha))
    # Plotting
    plt.figure()
    plt.plot(param_values, t_rmse, label='RMSE-Train')
    plt.plot(param_values, cv_rmse, label='RMSE-CV')
    plt.xlabel(param_name)
    plt.ylabel('RMSE')
    plt.legend()
    plt.show()
    return t_rmse, cv_rmse
ridge = Ridge()
alphas = np.linspace(0.01, 10, 50)
train_rmse, cv_rmse = calculate_parameters(X_train, y_train, ridge, alphas, 'alpha', 5)
optimal_alpha_ridge = alphas[np.argmin(cv_rmse)]
print(f"Optimal alpha for Ridge Regression: {optimal_alpha_ridge}")
Ridge Regression:
alpha 	 RMSE_train 	 RMSE_cv
0.010	 0.1262		 0.1359
0.214	 0.1265		 0.1349
0.418	 0.1268		 0.1346
0.622	 0.1271		 0.1345
0.826	 0.1273		 0.1344
1.029	 0.1274		 0.1343
1.233	 0.1276		 0.1343
1.437	 0.1278		 0.1343
1.641	 0.1279		 0.1342
1.845	 0.1280		 0.1342
2.049	 0.1281		 0.1342
2.253	 0.1283		 0.1342
2.457	 0.1284		 0.1342
2.660	 0.1285		 0.1342
2.864	 0.1286		 0.1342
3.068	 0.1287		 0.1342
3.272	 0.1288		 0.1342
3.476	 0.1288		 0.1342
3.680	 0.1289		 0.1343
3.884	 0.1290		 0.1343
4.088	 0.1291		 0.1343
4.291	 0.1292		 0.1343
4.495	 0.1292		 0.1343
4.699	 0.1293		 0.1343
4.903	 0.1294		 0.1343
5.107	 0.1294		 0.1343
5.311	 0.1295		 0.1343
5.515	 0.1296		 0.1344
5.719	 0.1296		 0.1344
5.922	 0.1297		 0.1344
6.126	 0.1297		 0.1344
6.330	 0.1298		 0.1344
6.534	 0.1298		 0.1344
6.738	 0.1299		 0.1344
6.942	 0.1299		 0.1344
7.146	 0.1300		 0.1344
7.350	 0.1300		 0.1345
7.553	 0.1301		 0.1345
7.757	 0.1301		 0.1345
7.961	 0.1302		 0.1345
8.165	 0.1302		 0.1345
8.369	 0.1303		 0.1345
8.573	 0.1303		 0.1345
8.777	 0.1304		 0.1345
8.981	 0.1304		 0.1346
9.184	 0.1304		 0.1346
9.388	 0.1305		 0.1346
9.592	 0.1305		 0.1346
9.796	 0.1306		 0.1346
10.000	 0.1306		 0.1346

The minimum error using cross validation is: 0.1342 with alpha : 2.2527
Optimal alpha for Ridge Regression: 2.2526530612244895
lasso = Lasso(max_iter=10000)
alphas = np.linspace(0.0001, 1, 50)
train_rmse, cv_rmse = calculate_parameters(X_train, y_train, lasso, alphas, 'alpha', 5)
optimal_alpha_lasso = alphas[np.argmin(cv_rmse)]
print(f"Optimal alpha for Lasso Regression: {optimal_alpha_lasso}")
Lasso Regression:
alpha 	 RMSE_train 	 RMSE_cv
0.000	 0.1282		 0.1347
0.021	 0.1760		 0.1763
0.041	 0.2312		 0.2312
0.061	 0.2312		 0.2312
0.082	 0.2312		 0.2312
0.102	 0.2312		 0.2312
0.123	 0.2312		 0.2312
0.143	 0.2312		 0.2312
0.163	 0.2312		 0.2312
0.184	 0.2312		 0.2312
0.204	 0.2312		 0.2312
0.225	 0.2312		 0.2312
0.245	 0.2312		 0.2312
0.265	 0.2312		 0.2312
0.286	 0.2312		 0.2312
0.306	 0.2312		 0.2312
0.327	 0.2312		 0.2312
0.347	 0.2312		 0.2312
0.367	 0.2312		 0.2312
0.388	 0.2312		 0.2312
0.408	 0.2312		 0.2312
0.429	 0.2312		 0.2312
0.449	 0.2312		 0.2312
0.469	 0.2312		 0.2312
0.490	 0.2312		 0.2312
0.510	 0.2312		 0.2312
0.531	 0.2312		 0.2312
0.551	 0.2312		 0.2312
0.571	 0.2312		 0.2312
0.592	 0.2312		 0.2312
0.612	 0.2312		 0.2312
0.633	 0.2312		 0.2312
0.653	 0.2312		 0.2312
0.674	 0.2312		 0.2312
0.694	 0.2312		 0.2312
0.714	 0.2312		 0.2312
0.735	 0.2312		 0.2312
0.755	 0.2312		 0.2312
0.776	 0.2312		 0.2312
0.796	 0.2312		 0.2312
0.816	 0.2312		 0.2312
0.837	 0.2312		 0.2312
0.857	 0.2312		 0.2312
0.878	 0.2312		 0.2312
0.898	 0.2312		 0.2312
0.918	 0.2312		 0.2312
0.939	 0.2312		 0.2312
0.959	 0.2312		 0.2312
0.980	 0.2312		 0.2312
1.000	 0.2312		 0.2312

The minimum error using cross validation is: 0.1347 with alpha : 0.0001
Optimal alpha for Lasso Regression: 0.0001
Ridge Regression
The optimal alpha value for Ridge regression is found to be approximately 2.25. This suggests that some amount of regularization is beneficial for this dataset, as the optimal alpha is greater than 0. The RMSE for both training and cross-validation is relatively low, indicating that the Ridge regression model is fitting the data well without overfitting.
Lasso Regression
The optimal alpha value for Lasso regression is found to be 0.0001, which is very close to zero. This indicates that Lasso regression is not applying much regularization to this dataset. In other words, the model is behaving similarly to a standard linear regression model. The RMSE values for Lasso regression are also relatively low, similar to those of Ridge regression.
Conclusion
Both Ridge and Lasso regression models are performing well on this dataset, with low RMSE values for training and cross-validation. The optimal alpha value for Ridge regression suggests that some regularization is beneficial, while the optimal alpha value for Lasso regression indicates that regularization is not significantly improving the model's performance. This might suggest that the dataset does not have high multicollinearity or that the features are all relevant to the target variable. In such cases, regularization might not provide a significant improvement over standard linear regression.
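As a sanity check on the hand-rolled alpha sweep, scikit-learn also ships RidgeCV and LassoCV, which run the cross-validated search internally. A minimal sketch on synthetic data (X_demo and y_demo are illustrative stand-ins, not the notebook's arrays):

```python
import numpy as np
from sklearn.linear_model import LassoCV, RidgeCV

# Synthetic stand-in data (illustrative only)
rng = np.random.default_rng(33)
X_demo = rng.random((300, 10))
y_demo = X_demo @ rng.random(10) + 0.1 * rng.standard_normal(300)

# RidgeCV scans a fixed alpha grid; LassoCV builds its own regularization path
ridge_cv = RidgeCV(alphas=np.linspace(0.01, 10, 50), cv=5).fit(X_demo, y_demo)
lasso_cv = LassoCV(cv=5, random_state=33, max_iter=10000).fit(X_demo, y_demo)
print('RidgeCV best alpha:', ridge_cv.alpha_)
print('LassoCV best alpha:', lasso_cv.alpha_)
```

Agreement between these built-in searches and the custom calculate_parameters sweep is a useful check that the grid and cross-validation scheme were set up correctly.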
PART E: [10 pts] Next, perform regression using the Stochastic Gradient Descent Regressor from scikit-learn (again using the original train-test split in part (a)). Note that SGDRegressor requires that features be standardized (with 0 mean and scaled by standard deviation). Prior to fitting the model, perform the scaling using StandardScaler from sklearn.preprocessing. For this problem, perform a grid search (using GridSearchCV from sklearn.model_selection). Your grid search should compare combinations of two penalty parameters ('l2', 'l1') and different values of alpha (alpha could vary from 0.0001, the default, to relatively large values, say 10). Using the best parameters, train the model on the full training partition and apply the model to the set-aside test data, comparing training and test RMSE scores. Finally, perform model optimization (similar to part (d), above) to find the best "l1_ratio" parameter using SGDRegressor with the "elasticnet" penalty parameter. [Note: "l1_ratio" is the Elastic Net mixing parameter, with 0 <= l1_ratio <= 1; l1_ratio=0 corresponds to the L2 penalty, l1_ratio=1 to the L1 penalty; it defaults to 0.15.] Using the best mixing ratio, apply the Elastic Net model, trained on the full training data, to the set-aside test data and compare to the training performance. Provide a brief summary of your findings from the above experiments.¶
# Standardize features: SGDRegressor expects zero-mean, unit-variance inputs
scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)
X_test_scaled = scaler.transform(X_test)
param_grid = {
'penalty': ['l2', 'l1'],
'alpha': [0.0001, 0.001, 0.01, 0.1, 1, 10]
}
sgd = SGDRegressor(max_iter=1000, tol=1e-3, random_state=33)
grid_search = GridSearchCV(sgd, param_grid, cv=5, scoring='neg_root_mean_squared_error')
grid_search.fit(X_train_scaled, y_train)
best_params = grid_search.best_params_
print(f"Best parameters: {best_params}")
Best parameters: {'alpha': 0.001, 'penalty': 'l1'}
sgd_best = SGDRegressor(**best_params, max_iter=1000, tol=1e-3, random_state=33)
sgd_best.fit(X_train_scaled, y_train)
# Training RMSE
p_train_sgd = sgd_best.predict(X_train_scaled)
rmse_train_sgd = np.sqrt(mean_squared_error(y_train, p_train_sgd))
print(f"RMSE for SGDRegressor on training set: {rmse_train_sgd}")
# Test RMSE
p_test_sgd = sgd_best.predict(X_test_scaled)
rmse_test_sgd = np.sqrt(mean_squared_error(y_test, p_test_sgd))
print(f"RMSE for SGDRegressor on test set: {rmse_test_sgd}")
RMSE for SGDRegressor on training set: 0.13199069423798254
RMSE for SGDRegressor on test set: 0.1468338635144639
param_grid_elasticnet = {
'penalty': ['elasticnet'],
'alpha': [0.0001, 0.001, 0.01, 0.1, 1, 10],
'l1_ratio': np.linspace(0, 1, 10)
}
sgd_elasticnet = SGDRegressor(max_iter=1000, tol=1e-3, random_state=33)
grid_search_elasticnet = GridSearchCV(sgd_elasticnet, param_grid_elasticnet, cv=5, scoring='neg_root_mean_squared_error')
grid_search_elasticnet.fit(X_train_scaled, y_train)
best_params_elasticnet = grid_search_elasticnet.best_params_
print(f"Best parameters for Elastic Net: {best_params_elasticnet}")
Best parameters for Elastic Net: {'alpha': 0.01, 'l1_ratio': 0.3333333333333333, 'penalty': 'elasticnet'}
sgd_elasticnet_best = SGDRegressor(**best_params_elasticnet, max_iter=1000, tol=1e-3, random_state=33)
sgd_elasticnet_best.fit(X_train_scaled, y_train)
# Training RMSE
p_train_elasticnet = sgd_elasticnet_best.predict(X_train_scaled)
rmse_train_elasticnet = np.sqrt(mean_squared_error(y_train, p_train_elasticnet))
print(f"RMSE for Elastic Net SGDRegressor on training set: {rmse_train_elasticnet}")
# Test RMSE
p_test_elasticnet = sgd_elasticnet_best.predict(X_test_scaled)
rmse_test_elasticnet = np.sqrt(mean_squared_error(y_test, p_test_elasticnet))
print(f"RMSE for Elastic Net SGDRegressor on test set: {rmse_test_elasticnet}")
RMSE for Elastic Net SGDRegressor on training set: 0.13309768382002807
RMSE for Elastic Net SGDRegressor on test set: 0.14589704444508042
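For the requested summary, the RMSE figures printed above can be gathered into one small table (values copied from the runs above, rounded to five decimals):

```python
import pandas as pd

summary = pd.DataFrame({
    'model': ['SGD (l1 penalty, alpha=0.001)',
              'SGD Elastic Net (alpha=0.01, l1_ratio=1/3)'],
    'train RMSE': [0.13199, 0.13310],
    'test RMSE': [0.14683, 0.14590],
})
print(summary.to_string(index=False))
```

Both models generalize similarly: the train/test RMSE gaps are small and the two penalties give nearly identical test performance.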
2. Automatic Document Clustering [Dataset: newsgroups5.zip]¶
For this problem you will use a different subset of the 20 Newsgroup data set that you used in Assignment 2 (see the description of the full dataset). The subset for this assignment includes 2,500 documents (newsgroup posts), each belonging to one of 5 categories: windows (0), crypt (1), christian (2), hockey (3), forsale (4). The documents are represented by 9328 terms (stems). The dictionary (vocabulary) for the data set is given in the file "terms.txt" and the full term-by-document matrix is given in "matrix.txt" (comma separated values). The actual category labels for the documents are provided in the file "classes.txt". Your goal in this assignment is to perform clustering on the documents and compare the clusters to the actual categories. Your tasks in this problem are the following [Note: for the clustering part of this assignment you should use the kMeans module from Ch. 10 of MLA (use the version provided here as it includes some corrections to the book version). Do not use the KMeans clustering function in scikit-learn. You may use Pandas and other modules from scikit-learn that you may need for preprocessing or evaluation.]
PART A:[5 pts] Create your own distance function that, instead of using Euclidean distance, uses Cosine similarity. This is the distance function you will pass to the kMeans function in the included module. Note: you should not use an external function for computing Cosine. Write your own version that computes Cosine similarity between two n-dimensional vectors and returns the inverse as the distance between these vectors.
data_matrix = pd.read_csv('N:/Programming machine learning/assignment3/communities/newsgroups5/matrix.txt', sep=',', header=None).values
data_terms = pd.read_csv('N:/Programming machine learning/assignment3/communities/newsgroups5/terms.txt', sep=',', header=None).values
data_classes = pd.read_csv('N:/Programming machine learning/assignment3/communities/newsgroups5/classes.txt', sep=',', header=None).values
def cosine_similarity_distance(vector1, vector2):
# Calculate the dot product of the two vectors
dot_product = np.dot(vector1, vector2)
# Calculate the L2 (Euclidean) norms of both vectors
norm_vector1 = np.linalg.norm(vector1)
norm_vector2 = np.linalg.norm(vector2)
# Calculate the Cosine similarity
cosine_similarity = dot_product / (norm_vector1 * norm_vector2)
# Convert the similarity to distance (inverse of similarity)
distance = 1 - cosine_similarity
return distance
vector1 = np.array([0.2, 0.8, 0.5, 0.1])
vector2 = np.array([0.5, 0.2, 0.7, 0.3])
distance = cosine_similarity_distance(vector1, vector2)
print(f"Distance between vector1 and vector2: {distance}")
Distance between vector1 and vector2: 0.2922882260402736
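As a sanity check (not required by the prompt), the hand-written distance can be compared against scikit-learn's `cosine_similarity` on the same two vectors:

```python
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

def cosine_similarity_distance(vector1, vector2):
    # 1 - cosine similarity, as in the function above
    dot_product = np.dot(vector1, vector2)
    return 1 - dot_product / (np.linalg.norm(vector1) * np.linalg.norm(vector2))

vector1 = np.array([0.2, 0.8, 0.5, 0.1])
vector2 = np.array([0.5, 0.2, 0.7, 0.3])

ours = cosine_similarity_distance(vector1, vector2)
theirs = 1 - cosine_similarity(vector1.reshape(1, -1), vector2.reshape(1, -1))[0, 0]
print(ours, theirs)  # both ≈ 0.29229, matching the output above
```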
PART B:[10 pts] Load the data set [Note: the data matrix provided has terms as rows and documents as columns. Since you will be clustering documents, you'll need to take the transpose of this matrix so that your main data matrix is a document x term matrix. In Numpy, you may use the ".T" operation to obtain the transpose.] Then, use the train_test_split function (with random_state = 99) to perform a randomized split of the data set (the document by term matrix) and set aside 20% for later use (see below). Use the 80% segment for clustering in the next part. Next, as in the previous assignment, perform TFxIDF transformation on these data sets. [Note: if you have difficulty with TFxIDF conversion, then use the original non-transformed data for the remainder of this assignment].
# Take the transpose to make it a document x term matrix
data_matrix = data_matrix.T
data_matrix
array([[0, 0, 0, ..., 0, 0, 0],
[0, 0, 0, ..., 0, 0, 0],
[0, 0, 0, ..., 0, 0, 0],
...,
[0, 0, 0, ..., 0, 0, 0],
[0, 0, 0, ..., 0, 0, 0],
[0, 0, 0, ..., 0, 0, 0]], dtype=int64)
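On a small example, the `.T` operation simply swaps rows and columns, which is exactly what turns a term-by-document matrix into a document-by-term matrix:

```python
import numpy as np

# Toy term-by-document matrix: 3 terms (rows) x 2 documents (columns)
term_by_doc = np.array([[1, 0],
                        [2, 3],
                        [0, 4]])

doc_by_term = term_by_doc.T  # now 2 documents (rows) x 3 terms (columns)
print(doc_by_term.shape)     # (2, 3)
```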
Split the data: Use the train_test_split function from scikit-learn to split the data into training (80%) and testing (20%) sets
# Split the data into training and testing sets
X_train, X_test = train_test_split(data_matrix, test_size=0.2, random_state=99)
Transform the data using TFxIDF. If you have a function for this transformation, you can apply it to the training and testing sets
# Create a TfidfTransformer object
transformer = TfidfTransformer()
# Fit and transform the training data
X_train_tfidf = transformer.fit_transform(X_train)
# Transform the testing data
X_test_tfidf = transformer.transform(X_test)
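To make the transformation concrete, the sketch below reconstructs `TfidfTransformer`'s default behavior by hand on a tiny count matrix; the defaults assumed here are `smooth_idf=True` (smoothed IDF) and `norm='l2'` (row normalization):

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfTransformer

# Toy document-by-term count matrix: 3 documents x 3 terms
counts = np.array([[3, 0, 1],
                   [2, 0, 0],
                   [0, 1, 2]])

tfidf_demo = TfidfTransformer().fit_transform(counts).toarray()

# Manual reconstruction of the default formula
n_docs = counts.shape[0]
df = (counts > 0).sum(axis=0)                  # document frequency per term
idf = np.log((1 + n_docs) / (1 + df)) + 1      # smooth_idf=True
raw = counts * idf
manual = raw / np.linalg.norm(raw, axis=1, keepdims=True)  # norm='l2'

print(np.allclose(tfidf_demo, manual))  # True
```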
PART C:[20 pts] Perform Kmeans clustering on the transformed training data from part (b). Perform a qualitative analysis of the clusters by examining top features in each cluster and identifying patterns in the data. To facilitate your analysis of the clusters, write a function to display the top N terms in each cluster sorted by decreasing centroid weights for each term in the cluster (mean TFxIDF frequency of the term). Your output should also display the cluster DF value for the top N terms. The cluster DF value for a term t in a cluster C is the percentage of docs in cluster C in which term t appears (so, if a cluster has 500 documents, and term "game" appears in 100 of those 500 documents, then the DF value of "game" in that cluster is 0.2 or 20%). For each cluster, you should also display the cluster size (the number of documents in the cluster). Here is an example of what this output might look like (here the top 10 terms for a sample of clusters are displayed in decreasing order of mean TFxIDF weights from the cluster centroids (the "Freq" column), but in addition the cluster DF values (both raw and as a percentage) are also shown).
Important Note: for this problem you should try several values of k for the number of clusters (try values of k from 4 through 8) and in each case try several runs in order to obtain clusters that seem more meaningful. In some cases, you may find some small clusters containing noise documents, which is not unusual. The point is to experiment with different runs and cluster numbers until you find at least several clusters that seem to capture some of the key topics in the documents. You do not need to provide the results of all your runs; you should only provide the results of your best clustering along with a brief discussion of your experimentation and your final observations.
kmeans = KMeans(n_clusters=5, random_state=99, n_init=10)
kmeans.fit(X_train_tfidf)
labels = kmeans.labels_
def display_top_terms_per_cluster(features, centroids, labels, n_terms=10):
for cluster_num in range(centroids.shape[0]):
print(f"Cluster {cluster_num} size= {sum(labels == cluster_num)}")
print("-" * 40)
print(f"{'':<15}{'Freq':<10}{'DF':<10}% of Docs")
centroid = centroids[cluster_num]
sorted_indices = centroid.argsort()[::-1][:n_terms]
for index in sorted_indices:
term = features[index]
# Calculate DF value for the term in this cluster
df_value = sum(1 for i, label in enumerate(labels) if label == cluster_num and X_train_tfidf[i, index] > 0)
df_percent = df_value / sum(labels == cluster_num)
print(f"{term:<15}{centroid[index]:<10.6f}{df_value:<10}{df_percent:<10.6f}")
print()
display_top_terms_per_cluster(data_terms.flatten(), kmeans.cluster_centers_, labels, n_terms=10)
Cluster 0 size= 359
----------------------------------------
Freq DF % of Docs
game 0.074576 198 0.551532
team 0.054066 179 0.498607
plai 0.043831 160 0.445682
hockei 0.039986 163 0.454039
go 0.037329 171 0.476323
player 0.035517 121 0.337047
nhl 0.030692 107 0.298050
fan 0.028936 104 0.289694
espn 0.027016 43 0.119777
playoff 0.026973 105 0.292479
Cluster 1 size= 305
----------------------------------------
Freq DF % of Docs
god 0.092441 212 0.695082
christian 0.068728 168 0.550820
sin 0.043591 84 0.275410
jesu 0.040305 105 0.344262
church 0.035935 85 0.278689
believ 0.033321 148 0.485246
peopl 0.033023 160 0.524590
bibl 0.032189 102 0.334426
on 0.031414 191 0.626230
hell 0.029744 38 0.124590
Cluster 2 size= 242
----------------------------------------
Freq DF % of Docs
kei 0.111888 166 0.685950
chip 0.090441 154 0.636364
clipper 0.074167 163 0.673554
encrypt 0.065273 138 0.570248
govern 0.057246 117 0.483471
escrow 0.046510 82 0.338843
secur 0.041069 95 0.392562
algorithm 0.038115 82 0.338843
phone 0.036483 69 0.285124
netcomcom 0.031128 66 0.272727
Cluster 3 size= 310
----------------------------------------
Freq DF % of Docs
window 0.133607 267 0.861290
file 0.070567 138 0.445161
driver 0.055992 81 0.261290
do 0.050080 116 0.374194
program 0.037508 100 0.322581
run 0.034618 111 0.358065
mous 0.029950 34 0.109677
problem 0.028257 93 0.300000
disk 0.024578 50 0.161290
card 0.024252 51 0.164516
Cluster 4 size= 784
----------------------------------------
Freq DF % of Docs
sale 0.036235 240 0.306122
subject 0.026739 784 1.000000
email 0.021423 210 0.267857
pleas 0.018601 181 0.230867
offer 0.016964 122 0.155612
drive 0.016054 64 0.081633
on 0.014611 206 0.262755
ship 0.014593 98 0.125000
thank 0.014281 146 0.186224
interest 0.014104 142 0.181122
As suggested in the question, I also experimented with k=8; the results were unusual¶
kmeans = KMeans(n_clusters=8, random_state=99, n_init=10)
kmeans.fit(X_train_tfidf)
labels = kmeans.labels_
# Reuse display_top_terms_per_cluster (defined in the k=5 run above)
display_top_terms_per_cluster(data_terms.flatten(), kmeans.cluster_centers_, labels, n_terms=10)
Cluster 0 size= 455
----------------------------------------
Freq DF % of Docs
subject 0.027096 455 1.000000
email 0.017190 92 0.202198
write 0.017094 176 0.386813
know 0.015988 120 0.263736
on 0.015911 135 0.296703
pleas 0.015171 79 0.173626
articl 0.014609 127 0.279121
want 0.014189 86 0.189011
thank 0.013902 81 0.178022
mail 0.013789 49 0.107692
Cluster 1 size= 142
----------------------------------------
Freq DF % of Docs
kei 0.169988 126 0.887324
chip 0.118957 106 0.746479
clipper 0.078049 91 0.640845
encrypt 0.076432 88 0.619718
escrow 0.064019 61 0.429577
secur 0.049128 62 0.436620
de 0.042739 42 0.295775
algorithm 0.042643 52 0.366197
phone 0.041096 46 0.323944
govern 0.037244 56 0.394366
Cluster 2 size= 358
----------------------------------------
Freq DF % of Docs
game 0.075064 199 0.555866
team 0.053673 178 0.497207
plai 0.043953 160 0.446927
hockei 0.040435 164 0.458101
go 0.036913 169 0.472067
player 0.035616 121 0.337989
nhl 0.030140 106 0.296089
fan 0.029017 104 0.290503
playoff 0.027165 105 0.293296
espn 0.027092 43 0.120112
Cluster 3 size= 295
----------------------------------------
Freq DF % of Docs
god 0.094484 206 0.698305
christian 0.071058 168 0.569492
sin 0.044543 82 0.277966
jesu 0.040808 103 0.349153
church 0.036112 82 0.277966
believ 0.034451 148 0.501695
peopl 0.033811 158 0.535593
bibl 0.033049 101 0.342373
on 0.031344 184 0.623729
hell 0.030753 38 0.128814
Cluster 4 size= 219
----------------------------------------
Freq DF % of Docs
window 0.157338 198 0.904110
file 0.090164 120 0.547945
do 0.062320 99 0.452055
program 0.048941 89 0.406393
run 0.044895 98 0.447489
disk 0.033885 47 0.214612
manag 0.031702 45 0.205479
os 0.028044 24 0.109589
nt 0.027094 21 0.095890
problem 0.027072 62 0.283105
Cluster 5 size= 156
----------------------------------------
Freq DF % of Docs
govern 0.063948 79 0.506410
clipper 0.051065 88 0.564103
sternlight 0.047114 49 0.314103
netcomcom 0.046829 58 0.371795
david 0.046116 53 0.339744
nsa 0.040349 48 0.307692
encrypt 0.037854 66 0.423077
tap 0.036240 52 0.333333
amanda 0.035140 14 0.089744
write 0.033079 127 0.814103
Cluster 6 size= 78
----------------------------------------
Freq DF % of Docs
driver 0.208662 66 0.846154
card 0.101605 41 0.525641
mous 0.085186 14 0.179487
printer 0.084817 18 0.230769
window 0.065372 48 0.615385
video 0.051947 26 0.333333
diamond 0.050083 19 0.243590
ati 0.043994 15 0.192308
problem 0.032061 24 0.307692
get 0.028813 32 0.410256
Cluster 7 size= 297
----------------------------------------
Freq DF % of Docs
sale 0.090072 213 0.717172
offer 0.040093 102 0.343434
drive 0.036663 52 0.175084
ship 0.035265 85 0.286195
price 0.031214 78 0.262626
email 0.027436 104 0.350168
subject 0.027331 297 1.000000
condit 0.026853 72 0.242424
sell 0.025818 74 0.249158
pleas 0.025464 101 0.340067
[Extra Credit - 5 pts: use your favorite third party tool or library, ideally with a Python based API, to create a word cloud for each cluster (using your best clustering from earlier experiments.]
def create_word_clouds(features, centroids, n_clusters, n_terms=50):
    for i in range(n_clusters):
        # Map the top-weighted centroid terms to their weights; restricting to the
        # top terms avoids passing thousands of near-zero frequencies to WordCloud
        top_indices = centroids[i].argsort()[::-1][:n_terms]
        words = {features[j]: centroids[i, j] for j in top_indices}
# Create and generate a word cloud image
wordcloud = WordCloud(width=800, height=400, background_color='white').generate_from_frequencies(words)
# Display the word cloud
plt.figure(figsize=(10, 5))
plt.imshow(wordcloud, interpolation='bilinear')
plt.axis('off')
plt.title(f'Cluster {i} Word Cloud')
plt.show()
create_word_clouds(data_terms.flatten(), kmeans.cluster_centers_, 8)
PART D: [5 pts] Using the cluster assignments from your Kmeans clustering and the original class labels for the training documents, compare your clusters to the pre-assigned classes by computing the Completeness and Homogeneity values. You should do this for the best value of k and the best clustering run you settled on in the previous part.
# Recover the true class labels; classes.txt rows are assumed to be "doc_id,class"
y = data_classes[:, 1].astype(int)
# Split documents and labels together so y_train stays aligned with X_train
X_train, X_test, y_train, y_test = train_test_split(data_matrix, y, test_size=0.2, random_state=99)
# Perform K-means clustering on X_train
kmeans = KMeans(n_clusters=5, random_state=99, n_init=10)
kmeans.fit(X_train)
cluster_labels = kmeans.labels_
# Calculate Completeness and Homogeneity scores
completeness = completeness_score(y_train, cluster_labels)
homogeneity = homogeneity_score(y_train, cluster_labels)
print("Completeness Score:", completeness)
print("Homogeneity Score:", homogeneity)
Completeness Score: 0.21051589375295532
Homogeneity Score: 0.08000210191249067
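To interpret these numbers, note that the two metrics behave very differently on degenerate clusterings. A toy example (hypothetical labels, for illustration only):

```python
from sklearn.metrics import completeness_score, homogeneity_score

y_true  = [0, 0, 1, 1]          # two true classes
perfect = [1, 1, 0, 0]          # same grouping, different label names
merged  = [0, 0, 0, 0]          # everything in one cluster
split   = [0, 1, 2, 3]          # every document in its own cluster

print(homogeneity_score(y_true, perfect), completeness_score(y_true, perfect))  # 1.0 1.0
print(homogeneity_score(y_true, merged),  completeness_score(y_true, merged))   # 0.0 1.0
print(homogeneity_score(y_true, split),   completeness_score(y_true, split))    # homogeneous (1.0) but incomplete (≈ 0.5)
```

The low homogeneity score above (about 0.08) therefore indicates clusters that mix documents from several newsgroup categories.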
[Extra Credit - 5 pts: Try several other clustering runs each time with values of k ranging between 4 and 8 and in each case compute Completeness and Homogeneity. This experiment will indicate which clustering provides the best representation of the original newsgroup categories. Provide a brief report of your experiment including a comparison of final results for at least three different runs.]
# As above, split documents and class labels together (classes.txt rows assumed "doc_id,class")
y = data_classes[:, 1].astype(int)
X_train, X_test, y_train, y_test = train_test_split(data_matrix, y, test_size=0.2, random_state=99)
# Range of k values to try
k_values = range(4, 9)
# Store the results
results = []
# Perform clustering for each value of k and compute scores
for k in k_values:
kmeans = KMeans(n_clusters=k, random_state=99, n_init=10)
kmeans.fit(X_train)
cluster_labels = kmeans.labels_
completeness = completeness_score(y_train, cluster_labels)
homogeneity = homogeneity_score(y_train, cluster_labels)
results.append((k, completeness, homogeneity))
# Print the results
print("k, Completeness, Homogeneity")
for k, completeness, homogeneity in results:
print(f"{k}, {completeness:.4f}, {homogeneity:.4f}")
# Brief report
best_result = max(results, key=lambda x: (x[1] + x[2]))
print(f"\nThe best clustering representation of the original newsgroup categories is achieved with k = {best_result[0]},")
print(f"where Completeness = {best_result[1]:.4f} and Homogeneity = {best_result[2]:.4f}.")
print("\nThis indicates that this particular clustering provides a good balance between having all members of a class in the same cluster (completeness) and having each cluster contain only members of a single class (homogeneity).")
k, Completeness, Homogeneity
4, 0.2137, 0.0688
5, 0.2105, 0.0800
6, 0.2286, 0.0956
7, 0.2349, 0.1054
8, 0.2475, 0.1202

The best clustering representation of the original newsgroup categories is achieved with k = 8,
where Completeness = 0.2475 and Homogeneity = 0.1202.

This indicates that this particular clustering provides a good balance between having all members of a class in the same cluster (completeness) and having each cluster contain only members of a single class (homogeneity).
PART E:[10 pts] Finally, using your cluster assignments as class labels, categorize each of the documents in the 20% set-aside data into each of the appropriate clusters (using your final clustering results in part c). Your categorization should be based on Cosine similarity between each test document and cluster centroids. For each test document show the assigned cluster label as well as Cosine similarity to the corresponding cluster.
# 'kmeans' here should be the final part (c) model (k=5, fit on X_train_tfidf)
# Calculate the cosine similarity between test documents and cluster centroids
cosine_similarities = cosine_similarity(X_test_tfidf, kmeans.cluster_centers_)
cosine_similarities = cosine_similarity(X_test_tfidf, kmeans.cluster_centers_)
# Assign each document to the cluster with the highest similarity
assigned_clusters = np.argmax(cosine_similarities, axis=1)
# Create a DataFrame to display the results
results_df = pd.DataFrame(cosine_similarities, columns=[f'document cosine similarity to cluster: {i}' for i in range(kmeans.n_clusters)])
results_df['cluster prediction'] = assigned_clusters
# Set pandas display options to show all rows
pd.set_option('display.max_rows', None)
results_df
| | document cosine similarity to cluster: 0 | document cosine similarity to cluster: 1 | document cosine similarity to cluster: 2 | document cosine similarity to cluster: 3 | document cosine similarity to cluster: 4 | cluster prediction |
|---|---|---|---|---|---|---|
| 0 | 0.000000 | 0.001248 | 0.000833 | 0.000588 | 0.020660 | 4 |
| 1 | 0.000000 | 0.012175 | 0.000927 | 0.001617 | 0.050094 | 4 |
| 2 | 0.001628 | 0.109369 | 0.025236 | 0.004617 | 0.162311 | 4 |
| 3 | 0.000000 | 0.016790 | 0.009358 | 0.000676 | 0.065156 | 4 |
| 4 | 0.028939 | 0.095960 | 0.005492 | 0.007459 | 0.126264 | 4 |
| 5 | 0.005062 | 0.004905 | 0.001082 | 0.018552 | 0.008221 | 3 |
| 6 | 0.000302 | 0.097846 | 0.004828 | 0.001609 | 0.093919 | 1 |
| 7 | 0.001497 | 0.039427 | 0.129605 | 0.005318 | 0.178417 | 4 |
| 8 | 0.171159 | 0.015345 | 0.012868 | 0.801010 | 0.073156 | 3 |
| 9 | 0.000141 | 0.026828 | 0.006934 | 0.002398 | 0.049239 | 4 |
| 10 | 0.002916 | 0.178271 | 0.008661 | 0.006180 | 0.182826 | 4 |
| 11 | 0.000350 | 0.035850 | 0.002056 | 0.001090 | 0.117572 | 4 |
| 12 | 0.001015 | 0.134890 | 0.002756 | 0.003440 | 0.103529 | 1 |
| 13 | 0.002515 | 0.738199 | 0.004282 | 0.002919 | 0.088072 | 1 |
| 14 | 0.003718 | 0.065931 | 0.013341 | 0.015155 | 0.185016 | 4 |
| 15 | 0.000979 | 0.007476 | 0.000927 | 0.000305 | 0.036918 | 4 |
| 16 | 0.000503 | 0.025819 | 0.105304 | 0.001041 | 0.054494 | 2 |
| 17 | 0.192541 | 0.013627 | 0.010797 | 0.829920 | 0.055282 | 3 |
| 18 | 0.000000 | 0.012771 | 0.001036 | 0.001829 | 0.041793 | 4 |
| 19 | 0.000159 | 0.021661 | 0.001820 | 0.000851 | 0.068473 | 4 |
| 20 | 0.002567 | 0.202408 | 0.019600 | 0.006492 | 0.299570 | 4 |
| 21 | 0.000000 | 0.008722 | 0.003947 | 0.000611 | 0.045643 | 4 |
| 22 | 0.000000 | 0.033921 | 0.003483 | 0.004138 | 0.042473 | 4 |
| 23 | 0.002504 | 0.382658 | 0.021574 | 0.004037 | 0.138313 | 1 |
| 24 | 0.000097 | 0.015430 | 0.000462 | 0.001482 | 0.038797 | 4 |
| 25 | 0.052511 | 0.002743 | 0.001788 | 0.153830 | 0.096546 | 3 |
| 26 | 0.002182 | 0.058309 | 0.171942 | 0.006629 | 0.298018 | 4 |
| 27 | 0.002993 | 0.739464 | 0.004477 | 0.007037 | 0.136615 | 1 |
| 28 | 0.000092 | 0.019885 | 0.000941 | 0.001913 | 0.059226 | 4 |
| 29 | 0.000388 | 0.005975 | 0.000639 | 0.001929 | 0.019926 | 4 |
| 30 | 0.000797 | 0.014174 | 0.174508 | 0.000988 | 0.076687 | 2 |
| 31 | 0.001771 | 0.036170 | 0.000912 | 0.001687 | 0.084663 | 4 |
| 32 | 0.000000 | 0.050283 | 0.001686 | 0.002035 | 0.096873 | 4 |
| 33 | 0.000881 | 0.082908 | 0.002942 | 0.003362 | 0.102027 | 4 |
| 34 | 0.000180 | 0.011362 | 0.000889 | 0.000384 | 0.048084 | 4 |
| 35 | 0.703964 | 0.013212 | 0.008903 | 0.658739 | 0.053855 | 0 |
| 36 | 0.000452 | 0.020694 | 0.000882 | 0.000650 | 0.036412 | 4 |
| 37 | 0.001161 | 0.171493 | 0.005258 | 0.003615 | 0.129514 | 1 |
| 38 | 0.003953 | 0.351868 | 0.014579 | 0.015555 | 0.375137 | 4 |
| 39 | 0.000570 | 0.013978 | 0.002230 | 0.000812 | 0.065528 | 4 |
| 40 | 0.000489 | 0.016993 | 0.001316 | 0.000000 | 0.045200 | 4 |
| 41 | 0.134708 | 0.026549 | 0.009178 | 0.637367 | 0.063427 | 3 |
| 42 | 0.000879 | 0.023501 | 0.032251 | 0.003202 | 0.045407 | 4 |
| 43 | 0.294046 | 0.011058 | 0.011696 | 0.824551 | 0.054028 | 3 |
| 44 | 0.001275 | 0.016190 | 0.010111 | 0.003786 | 0.123202 | 4 |
| 45 | 0.000134 | 0.015466 | 0.014443 | 0.000627 | 0.051459 | 4 |
| 46 | 0.000000 | 0.030310 | 0.001668 | 0.000877 | 0.043926 | 4 |
| 47 | 0.000321 | 0.132422 | 0.002015 | 0.000829 | 0.073778 | 1 |
| 48 | 0.000514 | 0.042033 | 0.010835 | 0.000805 | 0.092571 | 4 |
| 49 | 0.000235 | 0.024695 | 0.000500 | 0.001700 | 0.071202 | 4 |
| 50 | 0.000503 | 0.010580 | 0.000247 | 0.000247 | 0.025596 | 4 |
| 51 | 0.000529 | 0.040748 | 0.001364 | 0.000470 | 0.076682 | 4 |
| 52 | 0.011807 | 0.107693 | 0.067284 | 0.049300 | 0.278276 | 4 |
| 53 | 0.000753 | 0.005154 | 0.016493 | 0.000853 | 0.038925 | 4 |
| 54 | 0.000000 | 0.002526 | 0.695497 | 0.000000 | 0.010779 | 2 |
| 55 | 0.003633 | 0.043513 | 0.000977 | 0.000740 | 0.098520 | 4 |
| 56 | 0.000623 | 0.024467 | 0.008801 | 0.005801 | 0.178902 | 4 |
| 57 | 0.001625 | 0.014216 | 0.003455 | 0.002253 | 0.061232 | 4 |
| 58 | 0.000427 | 0.032470 | 0.120977 | 0.002815 | 0.175305 | 4 |
| 59 | 0.001125 | 0.141548 | 0.001298 | 0.001018 | 0.033835 | 1 |
| 60 | 0.000000 | 0.032714 | 0.000234 | 0.000872 | 0.095801 | 4 |
| 61 | 0.000101 | 0.013734 | 0.002824 | 0.002366 | 0.051917 | 4 |
| 62 | 0.000000 | 0.007248 | 0.000000 | 0.001242 | 0.018915 | 4 |
| 63 | 0.000000 | 0.004080 | 0.004821 | 0.000000 | 0.023044 | 4 |
| 64 | 0.001123 | 0.012358 | 0.008783 | 0.001102 | 0.066155 | 4 |
| 65 | 0.674781 | 0.008007 | 0.007400 | 0.278610 | 0.042839 | 0 |
| 66 | 0.001537 | 0.177628 | 0.003347 | 0.004584 | 0.147108 | 1 |
| 67 | 0.000519 | 0.052858 | 0.046530 | 0.003806 | 0.143960 | 4 |
| 68 | 0.000107 | 0.018910 | 0.003443 | 0.000197 | 0.048072 | 4 |
| 69 | 0.000243 | 0.034330 | 0.001708 | 0.001263 | 0.030365 | 1 |
| 70 | 0.001634 | 0.132175 | 0.162503 | 0.003500 | 0.112380 | 2 |
| 71 | 0.000000 | 0.001464 | 0.006023 | 0.000254 | 0.020212 | 4 |
| 72 | 0.000000 | 0.013169 | 0.000800 | 0.001415 | 0.030755 | 4 |
| 73 | 0.000113 | 0.032865 | 0.000935 | 0.001621 | 0.077648 | 4 |
| 74 | 0.000000 | 0.032647 | 0.002927 | 0.002726 | 0.071455 | 4 |
| 75 | 0.001647 | 0.044178 | 0.007607 | 0.002495 | 0.137995 | 4 |
| 76 | 0.000242 | 0.038971 | 0.000932 | 0.001317 | 0.061082 | 4 |
| 77 | 0.000000 | 0.004636 | 0.003558 | 0.000812 | 0.025899 | 4 |
| 78 | 0.001531 | 0.408000 | 0.002267 | 0.004689 | 0.067182 | 1 |
| 79 | 0.000000 | 0.006680 | 0.000411 | 0.000927 | 0.046425 | 4 |
| 80 | 0.008828 | 0.056854 | 0.026659 | 0.041310 | 0.119799 | 4 |
| 81 | 0.000345 | 0.031146 | 0.002477 | 0.002062 | 0.081745 | 4 |
| 82 | 0.001554 | 0.155989 | 0.015782 | 0.002151 | 0.110922 | 1 |
| 83 | 0.001650 | 0.096556 | 0.004578 | 0.010434 | 0.143684 | 4 |
| 84 | 0.000220 | 0.028759 | 0.000000 | 0.000000 | 0.044794 | 4 |
| 85 | 0.000530 | 0.065836 | 0.006058 | 0.000630 | 0.098135 | 4 |
| 86 | 0.000000 | 0.007968 | 0.003360 | 0.001824 | 0.031459 | 4 |
| 87 | 0.000000 | 0.011947 | 0.000329 | 0.001531 | 0.031035 | 4 |
| 88 | 0.000473 | 0.059479 | 0.001982 | 0.002596 | 0.187307 | 4 |
| 89 | 0.000000 | 0.003427 | 0.001874 | 0.001276 | 0.014456 | 4 |
| 90 | 0.000000 | 0.014017 | 0.002881 | 0.000000 | 0.039278 | 4 |
| 91 | 0.001273 | 0.201206 | 0.007553 | 0.004606 | 0.176557 | 1 |
| 92 | 0.002005 | 0.406811 | 0.004524 | 0.003914 | 0.093997 | 1 |
| 93 | 0.000024 | 0.014691 | 0.006030 | 0.001020 | 0.042118 | 4 |
| 94 | 0.000151 | 0.009188 | 0.009263 | 0.001028 | 0.030070 | 4 |
| 95 | 0.000904 | 0.063253 | 0.531111 | 0.002190 | 0.075858 | 2 |
| 96 | 0.000211 | 0.006949 | 0.007616 | 0.001390 | 0.046119 | 4 |
| 97 | 0.000000 | 0.004207 | 0.010152 | 0.000582 | 0.029387 | 4 |
| 98 | 0.000000 | 0.001005 | 0.003441 | 0.000145 | 0.016562 | 4 |
| 99 | 0.000845 | 0.080831 | 0.004185 | 0.006997 | 0.127150 | 4 |
| 100 | 0.000249 | 0.005402 | 0.000368 | 0.000387 | 0.023907 | 4 |
| 101 | 0.000399 | 0.023299 | 0.000696 | 0.001252 | 0.088635 | 4 |
| 102 | 0.000838 | 0.021818 | 0.090076 | 0.004493 | 0.116584 | 4 |
| 103 | 0.000838 | 0.129431 | 0.014736 | 0.003633 | 0.120501 | 1 |
| 104 | 0.000460 | 0.025869 | 0.035041 | 0.001434 | 0.057573 | 4 |
| 105 | 0.000429 | 0.018520 | 0.000000 | 0.001215 | 0.039229 | 4 |
| 106 | 0.000000 | 0.002369 | 0.164593 | 0.002237 | 0.016305 | 2 |
| 107 | 0.000662 | 0.038730 | 0.000989 | 0.002646 | 0.050520 | 4 |
| 108 | 0.000145 | 0.023064 | 0.000203 | 0.000588 | 0.057325 | 4 |
| 109 | 0.000000 | 0.024499 | 0.000000 | 0.001051 | 0.060383 | 4 |
| 110 | 0.000292 | 0.015251 | 0.067841 | 0.000685 | 0.096932 | 4 |
| 111 | 0.001438 | 0.043879 | 0.004895 | 0.001107 | 0.043136 | 1 |
| 112 | 0.000464 | 0.022697 | 0.001885 | 0.001500 | 0.093690 | 4 |
| 113 | 0.000936 | 0.055137 | 0.021297 | 0.003451 | 0.132667 | 4 |
| 114 | 0.000308 | 0.009049 | 0.000140 | 0.000311 | 0.016780 | 4 |
| 115 | 0.000000 | 0.002159 | 0.000537 | 0.000730 | 0.010272 | 4 |
| 116 | 0.000365 | 0.010302 | 0.000242 | 0.001027 | 0.027794 | 4 |
| 117 | 0.003581 | 0.126155 | 0.004630 | 0.019015 | 0.097223 | 1 |
| 118 | 0.000000 | 0.011433 | 0.000912 | 0.002503 | 0.045397 | 4 |
| 119 | 0.000000 | 0.010288 | 0.000000 | 0.000000 | 0.025718 | 4 |
| 120 | 0.000000 | 0.007496 | 0.000000 | 0.000685 | 0.021807 | 4 |
| 121 | 0.000233 | 0.006415 | 0.000129 | 0.000645 | 0.025256 | 4 |
| 122 | 0.000000 | 0.002572 | 0.675703 | 0.000000 | 0.014573 | 2 |
| 123 | 0.000000 | 0.004978 | 0.000000 | 0.000433 | 0.018338 | 4 |
| 124 | 0.022607 | 0.001027 | 0.000231 | 0.054837 | 0.106796 | 4 |
| 125 | 0.002722 | 0.009337 | 0.079256 | 0.004660 | 0.127455 | 4 |
| 126 | 0.687062 | 0.010855 | 0.011601 | 0.671466 | 0.051355 | 0 |
| 127 | 0.001959 | 0.008582 | 0.003829 | 0.002652 | 0.063312 | 4 |
| 128 | 0.000858 | 0.005593 | 0.000000 | 0.000487 | 0.020197 | 4 |
| 129 | 0.896723 | 0.003995 | 0.002266 | 0.110594 | 0.018018 | 0 |
| 130 | 0.001515 | 0.451834 | 0.002243 | 0.004004 | 0.110558 | 1 |
| 131 | 0.000731 | 0.016045 | 0.132516 | 0.002161 | 0.110596 | 2 |
| 132 | 0.001571 | 0.018210 | 0.001228 | 0.004294 | 0.048781 | 4 |
| 133 | 0.000647 | 0.033209 | 0.001186 | 0.001402 | 0.038507 | 4 |
| 134 | 0.001179 | 0.072876 | 0.008529 | 0.004288 | 0.111727 | 4 |
| 135 | 0.002356 | 0.090772 | 0.006900 | 0.003280 | 0.110729 | 4 |
| 136 | 0.001625 | 0.449100 | 0.004330 | 0.003991 | 0.116216 | 1 |
| 137 | 0.000468 | 0.030802 | 0.003141 | 0.004533 | 0.140712 | 4 |
| 138 | 0.000761 | 0.011367 | 0.001362 | 0.000000 | 0.030924 | 4 |
| 139 | 0.001662 | 0.295740 | 0.001658 | 0.001934 | 0.046124 | 1 |
| 140 | 0.001376 | 0.095757 | 0.004329 | 0.010101 | 0.153072 | 4 |
| 141 | 0.000351 | 0.010006 | 0.005416 | 0.000330 | 0.046682 | 4 |
| 142 | 0.000204 | 0.013595 | 0.467519 | 0.000000 | 0.091193 | 2 |
| 143 | 0.001322 | 0.246560 | 0.003837 | 0.002689 | 0.154550 | 1 |
| 144 | 0.000821 | 0.010817 | 0.000000 | 0.001397 | 0.027206 | 4 |
| ... | ... | ... | ... | ... | ... | ... |
| 677 | 0.001156 | 0.028917 | 0.000919 | 0.005371 | 0.071396 | 4 |
| 678 | 0.000365 | 0.007145 | 0.001236 | 0.001831 | 0.045361 | 4 |
| 679 | 0.000215 | 0.014334 | 0.458731 | 0.000000 | 0.090353 | 2 |
| 680 | 0.001657 | 0.003522 | 0.000132 | 0.000000 | 0.015268 | 4 |
| 681 | 0.000625 | 0.082186 | 0.008811 | 0.003941 | 0.254863 | 4 |
| 682 | 0.000596 | 0.005262 | 0.000158 | 0.000116 | 0.018824 | 4 |
| 683 | 0.001085 | 0.054745 | 0.005174 | 0.001517 | 0.093151 | 4 |
| 684 | 0.002038 | 0.227172 | 0.005977 | 0.005249 | 0.153632 | 1 |
| 685 | 0.000000 | 0.014208 | 0.001036 | 0.000116 | 0.050542 | 4 |
| 686 | 0.200041 | 0.011809 | 0.011733 | 0.921274 | 0.060847 | 3 |
| 687 | 0.000694 | 0.069816 | 0.000538 | 0.000779 | 0.074103 | 4 |
| 688 | 0.000441 | 0.008957 | 0.014400 | 0.004308 | 0.056175 | 4 |
| 689 | 0.000000 | 0.011923 | 0.000140 | 0.000517 | 0.028800 | 4 |
| 690 | 0.000229 | 0.050240 | 0.005243 | 0.003557 | 0.188542 | 4 |
| 691 | 0.002607 | 0.007583 | 0.010620 | 0.004128 | 0.118730 | 4 |
| 692 | 0.000298 | 0.091850 | 0.002262 | 0.003037 | 0.108809 | 4 |
| 693 | 0.000000 | 0.009427 | 0.297690 | 0.002736 | 0.119084 | 2 |
| 694 | 0.000000 | 0.015299 | 0.000000 | 0.000926 | 0.020945 | 4 |
| 695 | 0.000076 | 0.005930 | 0.026018 | 0.000000 | 0.035774 | 4 |
| 696 | 0.000457 | 0.011774 | 0.001646 | 0.002646 | 0.041612 | 4 |
| 697 | 0.001340 | 0.202989 | 0.010717 | 0.002762 | 0.088270 | 1 |
| 698 | 0.000132 | 0.013429 | 0.045428 | 0.000000 | 0.075106 | 4 |
| 699 | 0.000674 | 0.062797 | 0.004833 | 0.003912 | 0.132970 | 4 |
| 700 | 0.000598 | 0.058010 | 0.002409 | 0.002609 | 0.152519 | 4 |
| 701 | 0.001392 | 0.052592 | 0.002797 | 0.003229 | 0.060132 | 4 |
| 702 | 0.000601 | 0.093015 | 0.015698 | 0.002493 | 0.091694 | 1 |
| 703 | 0.000302 | 0.021543 | 0.001737 | 0.001541 | 0.031715 | 4 |
| 704 | 0.000697 | 0.011500 | 0.003431 | 0.001158 | 0.043738 | 4 |
| 705 | 0.000511 | 0.015125 | 0.011975 | 0.000585 | 0.049388 | 4 |
| 706 | 0.000000 | 0.017333 | 0.000772 | 0.000151 | 0.025840 | 4 |
| 707 | 0.030529 | 0.005771 | 0.001253 | 0.092700 | 0.022810 | 3 |
| 708 | 0.001112 | 0.010329 | 0.000680 | 0.005716 | 0.041682 | 4 |
| 709 | 0.000154 | 0.026880 | 0.001020 | 0.000463 | 0.041646 | 4 |
| 710 | 0.000314 | 0.022893 | 0.001059 | 0.000678 | 0.038273 | 4 |
| 711 | 0.000000 | 0.008514 | 0.017098 | 0.000075 | 0.031757 | 4 |
| 712 | 0.000334 | 0.011273 | 0.000763 | 0.003180 | 0.093509 | 4 |
| 713 | 0.001602 | 0.067087 | 0.006374 | 0.005310 | 0.151539 | 4 |
| 714 | 0.001225 | 0.041920 | 0.000266 | 0.001061 | 0.060149 | 4 |
| 715 | 0.783838 | 0.004884 | 0.007505 | 0.406907 | 0.040973 | 0 |
| 716 | 0.000000 | 0.007900 | 0.000000 | 0.000715 | 0.035670 | 4 |
| 717 | 0.002692 | 0.054781 | 0.009353 | 0.002458 | 0.159577 | 4 |
| 718 | 0.000000 | 0.016959 | 0.001234 | 0.000344 | 0.082443 | 4 |
| 719 | 0.001551 | 0.109164 | 0.011989 | 0.000710 | 0.079788 | 1 |
| 720 | 0.001644 | 0.066536 | 0.136104 | 0.004847 | 0.160411 | 4 |
| 721 | 0.001706 | 0.275512 | 0.002381 | 0.001429 | 0.095485 | 1 |
| 722 | 0.000476 | 0.011811 | 0.069096 | 0.001059 | 0.061552 | 2 |
| 723 | 0.002203 | 0.016789 | 0.022909 | 0.001748 | 0.081530 | 4 |
| 724 | 0.002323 | 0.007402 | 0.011503 | 0.005514 | 0.127606 | 4 |
| 725 | 0.000249 | 0.005402 | 0.000368 | 0.000387 | 0.023907 | 4 |
| 726 | 0.000000 | 0.020244 | 0.004398 | 0.001779 | 0.094460 | 4 |
| 727 | 0.000000 | 0.023923 | 0.003257 | 0.001322 | 0.043919 | 4 |
| 728 | 0.000113 | 0.025587 | 0.000854 | 0.000966 | 0.066886 | 4 |
| 729 | 0.000330 | 0.019026 | 0.323797 | 0.000243 | 0.113383 | 2 |
| 730 | 0.000622 | 0.022980 | 0.010714 | 0.000983 | 0.062542 | 4 |
| 731 | 0.000479 | 0.137951 | 0.002579 | 0.002157 | 0.092301 | 1 |
| 732 | 0.158916 | 0.014285 | 0.009807 | 0.755452 | 0.061267 | 3 |
| 733 | 0.001916 | 0.370307 | 0.005807 | 0.001841 | 0.091011 | 1 |
| 734 | 0.005113 | 0.587435 | 0.008178 | 0.005321 | 0.145833 | 1 |
| 735 | 0.000626 | 0.056931 | 0.005738 | 0.001256 | 0.127253 | 4 |
| 736 | 0.000283 | 0.030762 | 0.002394 | 0.001697 | 0.057384 | 4 |
| 737 | 0.000000 | 0.022238 | 0.002269 | 0.001618 | 0.048814 | 4 |
| 738 | 0.000000 | 0.021892 | 0.000965 | 0.000000 | 0.032496 | 4 |
| 739 | 0.000242 | 0.030136 | 0.017328 | 0.001295 | 0.095088 | 4 |
| 740 | 0.000690 | 0.198491 | 0.002874 | 0.003038 | 0.119415 | 1 |
| 741 | 0.000523 | 0.072551 | 0.005863 | 0.003554 | 0.234600 | 4 |
| 742 | 0.823321 | 0.001348 | 0.003635 | 0.147738 | 0.022047 | 0 |
| 743 | 0.000000 | 0.009019 | 0.002559 | 0.003182 | 0.114560 | 4 |
| 744 | 0.000974 | 0.013203 | 0.000000 | 0.001569 | 0.027836 | 4 |
| 745 | 0.001452 | 0.117198 | 0.007918 | 0.006116 | 0.196250 | 4 |
| 746 | 0.000595 | 0.004410 | 0.000000 | 0.000264 | 0.019257 | 4 |
| 747 | 0.000489 | 0.016073 | 0.019778 | 0.002675 | 0.055585 | 4 |
| 748 | 0.000498 | 0.012119 | 0.000843 | 0.000477 | 0.022623 | 4 |
| 749 | 0.000491 | 0.014894 | 0.002229 | 0.001369 | 0.066966 | 4 |
| 750 | 0.000231 | 0.022871 | 0.001099 | 0.000150 | 0.034683 | 4 |
| 751 | 0.001166 | 0.065551 | 0.007895 | 0.003586 | 0.175917 | 4 |
| 752 | 0.001420 | 0.076749 | 0.003066 | 0.003075 | 0.203376 | 4 |
| 753 | 0.002277 | 0.083569 | 0.028126 | 0.004593 | 0.216934 | 4 |
| 754 | 0.001838 | 0.140942 | 0.012486 | 0.005173 | 0.291133 | 4 |
| 755 | 0.000000 | 0.004346 | 0.004232 | 0.000000 | 0.018697 | 4 |
| 756 | 0.000000 | 0.008636 | 0.000000 | 0.000193 | 0.046112 | 4 |
| 757 | 0.000000 | 0.010517 | 0.002550 | 0.000256 | 0.034678 | 4 |
| 758 | 0.000469 | 0.003233 | 0.000421 | 0.001263 | 0.019713 | 4 |
| 759 | 0.000891 | 0.060065 | 0.010525 | 0.003129 | 0.088016 | 4 |
| 760 | 0.000000 | 0.003592 | 0.002914 | 0.000000 | 0.015217 | 4 |
| 761 | 0.005068 | 0.009027 | 0.018549 | 0.007709 | 0.191852 | 4 |
| 762 | 0.001355 | 0.091458 | 0.001227 | 0.001114 | 0.059648 | 1 |
| 763 | 0.000180 | 0.146404 | 0.000776 | 0.000237 | 0.044163 | 1 |
| 764 | 0.002515 | 0.337904 | 0.034500 | 0.008238 | 0.381933 | 4 |
| 765 | 0.000383 | 0.018362 | 0.120160 | 0.000807 | 0.117437 | 2 |
| 766 | 0.000223 | 0.013713 | 0.015391 | 0.002370 | 0.045245 | 4 |
| 767 | 0.000600 | 0.020959 | 0.358910 | 0.001287 | 0.110665 | 2 |
| 768 | 0.001780 | 0.067626 | 0.004224 | 0.012431 | 0.100720 | 4 |
| 769 | 0.001251 | 0.232827 | 0.003331 | 0.003989 | 0.108143 | 1 |
| 770 | 0.000238 | 0.009121 | 0.001457 | 0.001033 | 0.016725 | 4 |
| 771 | 0.000473 | 0.111356 | 0.007689 | 0.004736 | 0.290775 | 4 |
| 772 | 0.000285 | 0.015164 | 0.015601 | 0.000212 | 0.044654 | 4 |
| 773 | 0.000139 | 0.010196 | 0.015392 | 0.001421 | 0.090773 | 4 |
| 774 | 0.000000 | 0.027804 | 0.004433 | 0.000498 | 0.028754 | 4 |
| 775 | 0.089858 | 0.018841 | 0.005162 | 0.316566 | 0.069671 | 3 |
| 776 | 0.000406 | 0.011867 | 0.001370 | 0.000000 | 0.043389 | 4 |
| 777 | 0.001255 | 0.142266 | 0.021296 | 0.007196 | 0.250229 | 4 |
| 778 | 0.000000 | 0.002143 | 0.496246 | 0.000000 | 0.018669 | 2 |
| 779 | 0.000000 | 0.018509 | 0.000647 | 0.001246 | 0.040523 | 4 |
| 780 | 0.000000 | 0.002942 | 0.014053 | 0.000000 | 0.019517 | 4 |
| 781 | 0.000000 | 0.005981 | 0.004378 | 0.001643 | 0.040218 | 4 |
| 782 | 0.000504 | 0.010301 | 0.000088 | 0.000480 | 0.034426 | 4 |
| 783 | 0.000000 | 0.009395 | 0.000486 | 0.000750 | 0.030896 | 4 |
| 784 | 0.000000 | 0.012836 | 0.007016 | 0.000835 | 0.044319 | 4 |
| 785 | 0.000505 | 0.055114 | 0.004892 | 0.001389 | 0.052919 | 1 |
| 786 | 0.005886 | 0.027094 | 0.001379 | 0.026862 | 0.066995 | 4 |
| 787 | 0.001358 | 0.018081 | 0.000509 | 0.000964 | 0.039378 | 4 |
| 788 | 0.001592 | 0.035705 | 0.001121 | 0.002369 | 0.088541 | 4 |
| 789 | 0.000000 | 0.016119 | 0.000000 | 0.001250 | 0.041678 | 4 |
| 790 | 0.001123 | 0.012358 | 0.008783 | 0.001102 | 0.066155 | 4 |
| 791 | 0.000000 | 0.016686 | 0.001709 | 0.000000 | 0.044270 | 4 |
| 792 | 0.000186 | 0.151088 | 0.004306 | 0.000127 | 0.050504 | 1 |
| 793 | 0.000527 | 0.080479 | 0.002856 | 0.000823 | 0.044044 | 1 |
| 794 | 0.000307 | 0.057543 | 0.033881 | 0.002077 | 0.157754 | 4 |
| 795 | 0.001030 | 0.018086 | 0.001032 | 0.001634 | 0.041298 | 4 |
| 796 | 0.010306 | 0.000428 | 0.000000 | 0.020776 | 0.076556 | 4 |
| 797 | 0.000375 | 0.058939 | 0.022322 | 0.003078 | 0.151822 | 4 |
| 798 | 0.533387 | 0.008800 | 0.003299 | 0.126994 | 0.032726 | 0 |
| 799 | 0.000000 | 0.012673 | 0.000274 | 0.000147 | 0.033610 | 4 |
| 800 | 0.002024 | 0.242416 | 0.045574 | 0.000951 | 0.074113 | 1 |
| 801 | 0.000000 | 0.009643 | 0.002865 | 0.001046 | 0.053915 | 4 |
| 802 | 0.000798 | 0.103944 | 0.064439 | 0.004339 | 0.163976 | 4 |
| 803 | 0.002308 | 0.332241 | 0.008499 | 0.007203 | 0.263142 | 1 |
| 804 | 0.000204 | 0.015892 | 0.478739 | 0.000627 | 0.110671 | 2 |
| 805 | 0.003955 | 0.010308 | 0.014474 | 0.006257 | 0.155149 | 4 |
| 806 | 0.002739 | 0.279165 | 0.004048 | 0.003064 | 0.069707 | 1 |
| 807 | 0.000918 | 0.008219 | 0.000924 | 0.001791 | 0.023460 | 4 |
| 808 | 0.000184 | 0.022637 | 0.001421 | 0.000747 | 0.061579 | 4 |
| 809 | 0.000464 | 0.019229 | 0.001982 | 0.000310 | 0.049635 | 4 |
| 810 | 0.000723 | 0.039323 | 0.001894 | 0.001750 | 0.097433 | 4 |
| 811 | 0.002471 | 0.077094 | 0.006393 | 0.001598 | 0.075933 | 1 |
| 812 | 0.000293 | 0.017920 | 0.024533 | 0.001035 | 0.067480 | 4 |
| 813 | 0.000000 | 0.032943 | 0.032013 | 0.001540 | 0.090070 | 4 |
| 814 | 0.000089 | 0.007355 | 0.001287 | 0.002878 | 0.049560 | 4 |
| 815 | 0.000325 | 0.032762 | 0.003191 | 0.001462 | 0.106822 | 4 |
| 816 | 0.001355 | 0.100517 | 0.002054 | 0.002721 | 0.079383 | 1 |
| 817 | 0.001845 | 0.007906 | 0.007765 | 0.001980 | 0.080449 | 4 |
| 818 | 0.001247 | 0.016837 | 0.003719 | 0.002687 | 0.068009 | 4 |
| 819 | 0.527170 | 0.004056 | 0.000907 | 0.062944 | 0.020350 | 0 |
| 820 | 0.000000 | 0.009379 | 0.000000 | 0.002586 | 0.016376 | 4 |
| 821 | 0.002803 | 0.117525 | 0.017545 | 0.004154 | 0.276059 | 4 |
| 822 | 0.001017 | 0.030100 | 0.006746 | 0.000804 | 0.096596 | 4 |
| 823 | 0.002540 | 0.144124 | 0.002903 | 0.004041 | 0.117654 | 1 |
| 824 | 0.000000 | 0.015614 | 0.002243 | 0.002087 | 0.088426 | 4 |
| 825 | 0.179687 | 0.011337 | 0.010510 | 0.743507 | 0.069274 | 3 |
| 826 | 0.000077 | 0.040389 | 0.000887 | 0.001711 | 0.061448 | 4 |
| 827 | 0.620233 | 0.005201 | 0.008748 | 0.454368 | 0.035684 | 0 |
| 828 | 0.000963 | 0.045190 | 0.038148 | 0.004064 | 0.151322 | 4 |
| 829 | 0.000253 | 0.004806 | 0.000448 | 0.000021 | 0.025545 | 4 |
| 830 | 0.000677 | 0.037784 | 0.017929 | 0.000954 | 0.122709 | 4 |
| 831 | 0.188884 | 0.010947 | 0.010737 | 0.850739 | 0.103834 | 3 |
| 832 | 0.002898 | 0.228840 | 0.031961 | 0.009832 | 0.474249 | 4 |
| 833 | 0.000435 | 0.003792 | 0.517372 | 0.000553 | 0.027848 | 2 |
| 834 | 0.000000 | 0.006510 | 0.000000 | 0.000000 | 0.025172 | 4 |
| 835 | 0.001613 | 0.205686 | 0.029084 | 0.006401 | 0.280471 | 4 |
| 836 | 0.000064 | 0.007476 | 0.167612 | 0.001029 | 0.037860 | 2 |
| 837 | 0.043161 | 0.015251 | 0.003110 | 0.181084 | 0.057411 | 3 |
| 838 | 0.000231 | 0.014860 | 0.000279 | 0.000292 | 0.054544 | 4 |
| 839 | 0.000460 | 0.067808 | 0.011857 | 0.001893 | 0.094143 | 4 |
| 840 | 0.000000 | 0.045175 | 0.001461 | 0.002205 | 0.061587 | 4 |
| 841 | 0.000482 | 0.001934 | 0.000577 | 0.000000 | 0.017030 | 4 |
| 842 | 0.000000 | 0.009466 | 0.001101 | 0.000212 | 0.024290 | 4 |
| 843 | 0.000346 | 0.008618 | 0.087776 | 0.000585 | 0.041082 | 2 |
| 844 | 0.000790 | 0.017718 | 0.064301 | 0.004330 | 0.116770 | 4 |
| 845 | 0.000000 | 0.018451 | 0.000592 | 0.000462 | 0.033181 | 4 |
| 846 | 0.000645 | 0.004935 | 0.000141 | 0.000419 | 0.016085 | 4 |
| 847 | 0.000000 | 0.005300 | 0.000967 | 0.000224 | 0.019044 | 4 |
| 848 | 0.001990 | 0.362539 | 0.035781 | 0.004216 | 0.190722 | 1 |
| 849 | 0.000181 | 0.007140 | 0.429441 | 0.001628 | 0.063088 | 2 |
| 850 | 0.000000 | 0.017320 | 0.000334 | 0.000343 | 0.040007 | 4 |
| 851 | 0.000727 | 0.018355 | 0.015125 | 0.002897 | 0.079882 | 4 |
| 852 | 0.001089 | 0.175029 | 0.002722 | 0.002089 | 0.082715 | 1 |
| 853 | 0.000694 | 0.018693 | 0.000094 | 0.000295 | 0.058206 | 4 |
| 854 | 0.001011 | 0.021687 | 0.002058 | 0.002857 | 0.071068 | 4 |
| 855 | 0.000292 | 0.003626 | 0.000060 | 0.000000 | 0.011679 | 4 |
| 856 | 0.008669 | 0.026669 | 0.000756 | 0.041769 | 0.063989 | 4 |
| 857 | 0.000508 | 0.028051 | 0.000948 | 0.001044 | 0.065072 | 4 |
| 858 | 0.000847 | 0.030292 | 0.003564 | 0.001121 | 0.044291 | 4 |
| 859 | 0.000912 | 0.060977 | 0.002847 | 0.002593 | 0.134512 | 4 |
| 860 | 0.002920 | 0.762394 | 0.007123 | 0.003713 | 0.110362 | 1 |
| 861 | 0.002122 | 0.057673 | 0.002337 | 0.002337 | 0.110340 | 4 |
| 862 | 0.000000 | 0.007004 | 0.002248 | 0.000425 | 0.042285 | 4 |
| 863 | 0.000717 | 0.213377 | 0.001506 | 0.001739 | 0.090751 | 1 |
| 864 | 0.000172 | 0.011484 | 0.019206 | 0.000570 | 0.061231 | 4 |
| 865 | 0.000233 | 0.012227 | 0.000611 | 0.000363 | 0.059183 | 4 |
| 866 | 0.001244 | 0.070038 | 0.004444 | 0.004990 | 0.133472 | 4 |
| 867 | 0.000764 | 0.015097 | 0.001154 | 0.001900 | 0.050821 | 4 |
| 868 | 0.000217 | 0.010484 | 0.001365 | 0.003501 | 0.107897 | 4 |
| 869 | 0.000160 | 0.015189 | 0.068684 | 0.000134 | 0.052168 | 2 |
| 870 | 0.000559 | 0.186696 | 0.001628 | 0.001459 | 0.032244 | 1 |
| 871 | 0.544747 | 0.008588 | 0.010191 | 0.553452 | 0.044964 | 3 |
| 872 | 0.004227 | 0.337677 | 0.002761 | 0.003094 | 0.126345 | 1 |
| 873 | 0.000453 | 0.006404 | 0.424966 | 0.002862 | 0.065920 | 2 |
| 874 | 0.000223 | 0.006551 | 0.008435 | 0.000266 | 0.030188 | 4 |
| 875 | 0.002175 | 0.385922 | 0.008842 | 0.004193 | 0.177521 | 1 |
| 876 | 0.000000 | 0.006548 | 0.000211 | 0.001195 | 0.036219 | 4 |
| 877 | 0.000153 | 0.031041 | 0.003907 | 0.000565 | 0.074326 | 4 |
| 878 | 0.003246 | 0.006057 | 0.019391 | 0.004786 | 0.125717 | 4 |
| 879 | 0.000856 | 0.071915 | 0.004211 | 0.003005 | 0.081098 | 4 |
| 880 | 0.000104 | 0.024044 | 0.003202 | 0.000824 | 0.064379 | 4 |
| 881 | 0.013363 | 0.038726 | 0.003291 | 0.070226 | 0.066759 | 3 |
| 882 | 0.001117 | 0.121983 | 0.014201 | 0.002849 | 0.102979 | 1 |
| 883 | 0.000000 | 0.017791 | 0.001344 | 0.000000 | 0.037171 | 4 |
| 884 | 0.000875 | 0.040434 | 0.002049 | 0.002131 | 0.097778 | 4 |
| 885 | 0.000000 | 0.013283 | 0.000120 | 0.000000 | 0.024157 | 4 |
| 886 | 0.003531 | 0.422367 | 0.020284 | 0.007966 | 0.331130 | 1 |
| 887 | 0.000466 | 0.048768 | 0.002771 | 0.000500 | 0.064468 | 4 |
| 888 | 0.000839 | 0.015052 | 0.004615 | 0.002147 | 0.095086 | 4 |
| 889 | 0.060399 | 0.022699 | 0.005975 | 0.285018 | 0.097361 | 3 |
| 890 | 0.007352 | 0.359417 | 0.004267 | 0.032579 | 0.127274 | 1 |
| 891 | 0.002074 | 0.295135 | 0.015097 | 0.003823 | 0.206924 | 1 |
| 892 | 0.647210 | 0.004947 | 0.008700 | 0.449450 | 0.056894 | 0 |
| 893 | 0.000604 | 0.016144 | 0.000252 | 0.000693 | 0.051121 | 4 |
| 894 | 0.000000 | 0.008804 | 0.094401 | 0.000457 | 0.026491 | 2 |
| 895 | 0.001406 | 0.132799 | 0.008871 | 0.004646 | 0.187246 | 4 |
| 896 | 0.000095 | 0.041713 | 0.004130 | 0.003232 | 0.104925 | 4 |
| 897 | 0.001341 | 0.048337 | 0.007612 | 0.005145 | 0.087892 | 4 |
| 898 | 0.000111 | 0.028759 | 0.006855 | 0.002631 | 0.098134 | 4 |
| 899 | 0.040736 | 0.291321 | 0.094463 | 0.065265 | 0.679788 | 4 |
| 900 | 0.000000 | 0.020217 | 0.001128 | 0.001920 | 0.063760 | 4 |
| 901 | 0.879282 | 0.002464 | 0.001933 | 0.107035 | 0.016968 | 0 |
| 902 | 0.120115 | 0.011653 | 0.007126 | 0.544123 | 0.045756 | 3 |
| 903 | 0.000743 | 0.012224 | 0.004934 | 0.001762 | 0.050401 | 4 |
| 904 | 0.000325 | 0.031258 | 0.001704 | 0.001432 | 0.057183 | 4 |
| 905 | 0.022860 | 0.002363 | 0.001385 | 0.108141 | 0.014647 | 3 |
| 906 | 0.000000 | 0.032905 | 0.002073 | 0.003128 | 0.065701 | 4 |
| 907 | 0.000830 | 0.097061 | 0.007080 | 0.002234 | 0.078473 | 1 |
| 908 | 0.000000 | 0.012452 | 0.000909 | 0.000345 | 0.032163 | 4 |
| 909 | 0.000000 | 0.016760 | 0.000734 | 0.001493 | 0.045303 | 4 |
| 910 | 0.325193 | 0.046532 | 0.009089 | 0.541135 | 0.079210 | 3 |
| 911 | 0.000865 | 0.177788 | 0.003631 | 0.003606 | 0.132616 | 1 |
| 912 | 0.000231 | 0.005663 | 0.000000 | 0.000085 | 0.020347 | 4 |
| 913 | 0.012730 | 0.054069 | 0.003035 | 0.061566 | 0.098928 | 4 |
| 914 | 0.000000 | 0.022810 | 0.000450 | 0.000418 | 0.042744 | 4 |
| 915 | 0.141957 | 0.057519 | 0.005838 | 0.012468 | 0.070273 | 0 |
| 916 | 0.000623 | 0.014331 | 0.007250 | 0.000851 | 0.047263 | 4 |
| 917 | 0.001491 | 0.080023 | 0.300169 | 0.004366 | 0.258587 | 2 |
| 918 | 0.003508 | 0.129389 | 0.010388 | 0.015673 | 0.186038 | 4 |
| 919 | 0.001554 | 0.135510 | 0.000910 | 0.001238 | 0.071175 | 1 |
| 920 | 0.000000 | 0.009466 | 0.001101 | 0.000212 | 0.024290 | 4 |
| 921 | 0.000000 | 0.002927 | 0.001296 | 0.000875 | 0.031414 | 4 |
| 922 | 0.000000 | 0.038307 | 0.001126 | 0.000267 | 0.022892 | 1 |
| 923 | 0.000153 | 0.012909 | 0.525099 | 0.000851 | 0.083405 | 2 |
| 924 | 0.000244 | 0.011491 | 0.000396 | 0.000378 | 0.038951 | 4 |
| 925 | 0.000115 | 0.033503 | 0.001732 | 0.001251 | 0.043011 | 4 |
| 926 | 0.000249 | 0.005402 | 0.000368 | 0.000387 | 0.023907 | 4 |
| 927 | 0.000000 | 0.016886 | 0.000100 | 0.000149 | 0.043464 | 4 |
| 928 | 0.001502 | 0.016807 | 0.000488 | 0.003179 | 0.047568 | 4 |
| 929 | 0.000233 | 0.007350 | 0.002318 | 0.000708 | 0.024542 | 4 |
| 930 | 0.000290 | 0.010982 | 0.000550 | 0.000204 | 0.044925 | 4 |
| 931 | 0.000000 | 0.015323 | 0.000699 | 0.000000 | 0.029760 | 4 |
| 932 | 0.000709 | 0.040613 | 0.001326 | 0.002930 | 0.120718 | 4 |
| 933 | 0.343707 | 0.019200 | 0.010996 | 0.723915 | 0.136435 | 3 |
| 934 | 0.000407 | 0.022389 | 0.003833 | 0.000755 | 0.055141 | 4 |
| 935 | 0.000496 | 0.020289 | 0.005059 | 0.001118 | 0.075119 | 4 |
| 936 | 0.824347 | 0.005423 | 0.002430 | 0.103364 | 0.017407 | 0 |
| 937 | 0.000000 | 0.002409 | 0.564862 | 0.000000 | 0.019616 | 2 |
| 938 | 0.000155 | 0.067727 | 0.003084 | 0.002832 | 0.177401 | 4 |
| 939 | 0.002718 | 0.014417 | 0.014723 | 0.006780 | 0.192926 | 4 |
| 940 | 0.001005 | 0.116841 | 0.000706 | 0.005135 | 0.030445 | 1 |
| 941 | 0.000000 | 0.013936 | 0.015885 | 0.000464 | 0.050417 | 4 |
| 942 | 0.000526 | 0.055439 | 0.007731 | 0.005141 | 0.067235 | 4 |
| 943 | 0.003346 | 0.017040 | 0.010160 | 0.006288 | 0.148424 | 4 |
| 944 | 0.325595 | 0.007296 | 0.013244 | 0.030524 | 0.141744 | 0 |
| 945 | 0.000000 | 0.015980 | 0.082910 | 0.000936 | 0.055760 | 2 |
| 946 | 0.000155 | 0.007183 | 0.003823 | 0.000486 | 0.036766 | 4 |
| 947 | 0.879282 | 0.002464 | 0.001933 | 0.107035 | 0.016968 | 0 |
| 948 | 0.000367 | 0.040018 | 0.002283 | 0.000877 | 0.055353 | 4 |
| 949 | 0.000523 | 0.042913 | 0.005766 | 0.003266 | 0.178125 | 4 |
| 950 | 0.000606 | 0.053819 | 0.002679 | 0.002594 | 0.129477 | 4 |
| 951 | 0.000307 | 0.016423 | 0.005247 | 0.001882 | 0.042625 | 4 |
| 952 | 0.046915 | 0.010492 | 0.004475 | 0.217003 | 0.031670 | 3 |
| 953 | 0.000738 | 0.006029 | 0.000285 | 0.000451 | 0.019246 | 4 |
| 954 | 0.001532 | 0.236235 | 0.001489 | 0.001533 | 0.060713 | 1 |
| 955 | 0.000404 | 0.017329 | 0.043908 | 0.001561 | 0.064656 | 4 |
| 956 | 0.000000 | 0.020530 | 0.001274 | 0.001030 | 0.038488 | 4 |
| 957 | 0.000000 | 0.010890 | 0.000077 | 0.001413 | 0.055603 | 4 |
| 958 | 0.001426 | 0.104416 | 0.001495 | 0.003187 | 0.084637 | 1 |
| 959 | 0.839720 | 0.001256 | 0.004265 | 0.162747 | 0.022326 | 0 |
| 960 | 0.000146 | 0.012738 | 0.003254 | 0.003257 | 0.035983 | 4 |
| 961 | 0.001174 | 0.043498 | 0.003015 | 0.001988 | 0.095749 | 4 |
| 962 | 0.000244 | 0.015975 | 0.000317 | 0.000274 | 0.022372 | 4 |
| 963 | 0.000000 | 0.005931 | 0.000507 | 0.001870 | 0.030265 | 4 |
| 964 | 0.001222 | 0.048621 | 0.010296 | 0.008032 | 0.072177 | 4 |
| 965 | 0.000000 | 0.007516 | 0.001186 | 0.000000 | 0.035945 | 4 |
| 966 | 0.001012 | 0.008859 | 0.049779 | 0.002168 | 0.099696 | 4 |
| 967 | 0.000000 | 0.003355 | 0.007148 | 0.000489 | 0.040637 | 4 |
| 968 | 0.031386 | 0.001693 | 0.000452 | 0.081912 | 0.092230 | 4 |
| 969 | 0.001533 | 0.206291 | 0.018879 | 0.004577 | 0.157582 | 1 |
| 970 | 0.000000 | 0.004229 | 0.002920 | 0.001294 | 0.016455 | 4 |
| 971 | 0.000617 | 0.021800 | 0.002076 | 0.002839 | 0.056140 | 4 |
| 972 | 0.000000 | 0.009716 | 0.000906 | 0.000316 | 0.023557 | 4 |
| 973 | 0.000571 | 0.139884 | 0.001570 | 0.001290 | 0.072907 | 1 |
| 974 | 0.000000 | 0.000992 | 0.000000 | 0.000889 | 0.009222 | 4 |
| 975 | 0.001647 | 0.035736 | 0.014821 | 0.001797 | 0.067372 | 4 |
| 976 | 0.002313 | 0.317715 | 0.006829 | 0.003729 | 0.188490 | 1 |
| 977 | 0.000216 | 0.012236 | 0.021768 | 0.000266 | 0.042061 | 4 |
| 978 | 0.001037 | 0.113786 | 0.006600 | 0.004360 | 0.103117 | 1 |
| 979 | 0.000000 | 0.026652 | 0.002019 | 0.000743 | 0.103657 | 4 |
| 980 | 0.001372 | 0.011909 | 0.002478 | 0.003496 | 0.098083 | 4 |
| 981 | 0.000423 | 0.048839 | 0.004196 | 0.000765 | 0.053918 | 4 |
| 982 | 0.000610 | 0.020471 | 0.421507 | 0.000812 | 0.101046 | 2 |
| 983 | 0.000000 | 0.001609 | 0.007153 | 0.000000 | 0.025021 | 4 |
| 984 | 0.003806 | 0.052055 | 0.002039 | 0.001972 | 0.125715 | 4 |
| 985 | 0.000163 | 0.008448 | 0.006847 | 0.000821 | 0.056555 | 4 |
| 986 | 0.005068 | 0.009027 | 0.018549 | 0.007709 | 0.191852 | 4 |
| 987 | 0.004107 | 0.011329 | 0.018206 | 0.007062 | 0.167333 | 4 |
| 988 | 0.000557 | 0.033555 | 0.005112 | 0.001902 | 0.084737 | 4 |
| 989 | 0.000000 | 0.003995 | 0.000239 | 0.000913 | 0.019326 | 4 |
| 990 | 0.000542 | 0.057866 | 0.004693 | 0.002051 | 0.133479 | 4 |
| 991 | 0.000702 | 0.039333 | 0.002346 | 0.001946 | 0.126102 | 4 |
| 992 | 0.000419 | 0.147586 | 0.001273 | 0.003663 | 0.048911 | 1 |
| 993 | 0.000000 | 0.008712 | 0.001646 | 0.002086 | 0.047726 | 4 |
| 994 | 0.000000 | 0.062729 | 0.009920 | 0.004337 | 0.174110 | 4 |
| 995 | 0.000736 | 0.027685 | 0.005176 | 0.003121 | 0.097350 | 4 |
| 996 | 0.005068 | 0.009027 | 0.018549 | 0.007709 | 0.191852 | 4 |
| 997 | 0.000908 | 0.025600 | 0.002387 | 0.000000 | 0.071746 | 4 |
| 998 | 0.001363 | 0.058766 | 0.024143 | 0.003945 | 0.137552 | 4 |
| 999 | 0.001421 | 0.094184 | 0.007003 | 0.003953 | 0.121130 | 4 |
| 1000 | 0.000000 | 0.012088 | 0.001126 | 0.001460 | 0.034232 | 4 |
| 1001 | 0.000000 | 0.014191 | 0.149866 | 0.000000 | 0.040706 | 2 |
| 1002 | 0.000169 | 0.019998 | 0.002244 | 0.000603 | 0.045618 | 4 |
| 1003 | 0.000000 | 0.024975 | 0.002552 | 0.001218 | 0.067272 | 4 |
| 1004 | 0.000337 | 0.003107 | 0.000000 | 0.000000 | 0.009184 | 4 |
| 1005 | 0.000230 | 0.009668 | 0.052622 | 0.001903 | 0.036534 | 2 |
| 1006 | 0.000750 | 0.085563 | 0.184692 | 0.002681 | 0.201494 | 4 |
| 1007 | 0.000251 | 0.111716 | 0.003379 | 0.002403 | 0.107473 | 1 |
| 1008 | 0.000000 | 0.006531 | 0.000000 | 0.000198 | 0.011615 | 4 |
| 1009 | 0.001022 | 0.031458 | 0.000096 | 0.001307 | 0.050643 | 4 |
| 1010 | 0.000121 | 0.033868 | 0.156008 | 0.001467 | 0.107166 | 2 |
| 1011 | 0.001521 | 0.100466 | 0.015343 | 0.003191 | 0.202257 | 4 |
| 1012 | 0.000919 | 0.023940 | 0.000346 | 0.000742 | 0.041188 | 4 |
| 1013 | 0.806201 | 0.003854 | 0.001744 | 0.085014 | 0.014011 | 0 |
| 1014 | 0.001279 | 0.017033 | 0.012111 | 0.001049 | 0.057181 | 4 |
| 1015 | 0.002198 | 0.120227 | 0.007408 | 0.003382 | 0.170389 | 4 |
| 1016 | 0.001079 | 0.218145 | 0.026887 | 0.004835 | 0.192180 | 1 |
| 1017 | 0.000000 | 0.061366 | 0.000828 | 0.002505 | 0.132322 | 4 |
| 1018 | 0.000233 | 0.089672 | 0.005427 | 0.002176 | 0.188149 | 4 |
| 1019 | 0.000747 | 0.248672 | 0.001789 | 0.001127 | 0.046813 | 1 |
| 1020 | 0.000000 | 0.031459 | 0.003053 | 0.001698 | 0.073800 | 4 |
| 1021 | 0.000307 | 0.023429 | 0.000135 | 0.000940 | 0.027399 | 4 |
| 1022 | 0.001150 | 0.025373 | 0.043957 | 0.002950 | 0.153002 | 4 |
| 1023 | 0.000000 | 0.019088 | 0.000336 | 0.000759 | 0.028508 | 4 |
| 1024 | 0.000000 | 0.003118 | 0.006007 | 0.000000 | 0.017976 | 4 |
| 1025 | 0.001191 | 0.125496 | 0.061409 | 0.004470 | 0.159931 | 4 |
| 1026 | 0.000070 | 0.014494 | 0.000206 | 0.000482 | 0.034001 | 4 |
| 1027 | 0.000847 | 0.006362 | 0.000000 | 0.000000 | 0.016249 | 4 |
| 1028 | 0.003217 | 0.057441 | 0.012677 | 0.013488 | 0.168842 | 4 |
| 1029 | 0.002192 | 0.156418 | 0.005005 | 0.003177 | 0.185016 | 4 |
| 1030 | 0.370548 | 0.016247 | 0.006985 | 0.459588 | 0.076179 | 3 |
| 1031 | 0.000962 | 0.318494 | 0.005695 | 0.003149 | 0.115724 | 1 |
| 1032 | 0.755703 | 0.001495 | 0.005111 | 0.177950 | 0.023972 | 0 |
| 1033 | 0.001226 | 0.257730 | 0.007595 | 0.002983 | 0.192958 | 1 |
| 1034 | 0.028792 | 0.001692 | 0.000452 | 0.075467 | 0.087944 | 4 |
| 1035 | 0.000202 | 0.023626 | 0.003078 | 0.002424 | 0.070466 | 4 |
| 1036 | 0.000672 | 0.013502 | 0.000428 | 0.001944 | 0.054781 | 4 |
| 1037 | 0.001597 | 0.014386 | 0.000812 | 0.004852 | 0.067134 | 4 |
| 1038 | 0.000000 | 0.015754 | 0.000941 | 0.000726 | 0.043357 | 4 |
| 1039 | 0.000620 | 0.017421 | 0.124548 | 0.002789 | 0.089582 | 2 |
| 1040 | 0.000000 | 0.008871 | 0.005101 | 0.000423 | 0.035605 | 4 |
| 1041 | 0.612214 | 0.048563 | 0.008764 | 0.364148 | 0.059867 | 0 |
| 1042 | 0.000214 | 0.015609 | 0.000579 | 0.000912 | 0.022851 | 4 |
| 1043 | 0.000877 | 0.228066 | 0.001601 | 0.003106 | 0.074553 | 1 |
| 1044 | 0.000769 | 0.031768 | 0.005022 | 0.001343 | 0.120281 | 4 |
| 1045 | 0.000827 | 0.006133 | 0.055364 | 0.002216 | 0.083062 | 4 |
| 1046 | 0.000380 | 0.034910 | 0.005369 | 0.000908 | 0.122573 | 4 |
| 1047 | 0.000637 | 0.006389 | 0.000487 | 0.001575 | 0.025666 | 4 |
| 1048 | 0.000170 | 0.035505 | 0.005261 | 0.000346 | 0.085804 | 4 |
| 1049 | 0.000000 | 0.011151 | 0.000343 | 0.001643 | 0.021943 | 4 |
| 1050 | 0.000910 | 0.005743 | 0.007624 | 0.000739 | 0.036275 | 4 |
| 1051 | 0.001367 | 0.247144 | 0.003205 | 0.002511 | 0.152537 | 1 |
| 1052 | 0.004407 | 0.380207 | 0.011853 | 0.013604 | 0.158846 | 1 |
| 1053 | 0.000607 | 0.060775 | 0.002244 | 0.001718 | 0.162566 | 4 |
| 1054 | 0.000000 | 0.012307 | 0.001547 | 0.001361 | 0.055434 | 4 |
| 1055 | 0.002333 | 0.006994 | 0.014186 | 0.004309 | 0.126199 | 4 |
| 1056 | 0.197131 | 0.015889 | 0.011706 | 0.914549 | 0.072679 | 3 |
| 1057 | 0.000000 | 0.011830 | 0.000000 | 0.000000 | 0.037427 | 4 |
| 1058 | 0.000597 | 0.024000 | 0.000698 | 0.001354 | 0.035900 | 4 |
| 1059 | 0.001323 | 0.224921 | 0.013770 | 0.006075 | 0.361191 | 4 |
| 1060 | 0.000064 | 0.021555 | 0.000759 | 0.000471 | 0.046951 | 4 |
| 1061 | 0.002020 | 0.014859 | 0.000420 | 0.002891 | 0.042755 | 4 |
| 1062 | 0.096341 | 0.025834 | 0.006118 | 0.369990 | 0.081393 | 3 |
| 1063 | 0.259643 | 0.022927 | 0.031013 | 0.394443 | 0.113613 | 3 |
| 1064 | 0.000000 | 0.004882 | 0.095418 | 0.000000 | 0.039130 | 2 |
| 1065 | 0.007271 | 0.014858 | 0.019256 | 0.036029 | 0.088028 | 4 |
| 1066 | 0.000296 | 0.017362 | 0.001583 | 0.001020 | 0.056085 | 4 |
| 1067 | 0.003156 | 0.604813 | 0.007567 | 0.009445 | 0.165131 | 1 |
| 1068 | 0.001990 | 0.451411 | 0.004972 | 0.006437 | 0.120262 | 1 |
| 1069 | 0.139916 | 0.024525 | 0.011878 | 0.655355 | 0.083460 | 3 |
| 1070 | 0.000472 | 0.153259 | 0.004783 | 0.003343 | 0.129107 | 1 |
| 1071 | 0.000115 | 0.007649 | 0.100085 | 0.001562 | 0.062378 | 2 |
| 1072 | 0.002263 | 0.008249 | 0.032705 | 0.004768 | 0.098284 | 4 |
| 1073 | 0.000000 | 0.021575 | 0.036663 | 0.000569 | 0.050686 | 4 |
| 1074 | 0.000000 | 0.011489 | 0.000487 | 0.000416 | 0.034490 | 4 |
| 1075 | 0.000141 | 0.018886 | 0.001127 | 0.000459 | 0.048966 | 4 |
| 1076 | 0.004124 | 0.046105 | 0.001680 | 0.001485 | 0.101463 | 4 |
| 1077 | 0.000000 | 0.016317 | 0.405768 | 0.000112 | 0.075550 | 2 |
| 1078 | 0.145212 | 0.020976 | 0.010196 | 0.687057 | 0.059423 | 3 |
| 1079 | 0.000549 | 0.025641 | 0.004524 | 0.000358 | 0.062204 | 4 |
| 1080 | 0.001002 | 0.051270 | 0.027939 | 0.003036 | 0.189881 | 4 |
| 1081 | 0.000458 | 0.015609 | 0.000946 | 0.001318 | 0.085284 | 4 |
| 1082 | 0.000590 | 0.007854 | 0.001125 | 0.000000 | 0.016516 | 4 |
| 1083 | 0.000089 | 0.031298 | 0.000883 | 0.001411 | 0.066472 | 4 |
| 1084 | 0.000000 | 0.005473 | 0.001525 | 0.000431 | 0.030551 | 4 |
| 1085 | 0.000289 | 0.064223 | 0.001761 | 0.004020 | 0.068474 | 4 |
| 1086 | 0.000000 | 0.003706 | 0.000562 | 0.000675 | 0.027562 | 4 |
| 1087 | 0.000852 | 0.047318 | 0.006432 | 0.002318 | 0.188157 | 4 |
| 1088 | 0.906915 | 0.009418 | 0.006177 | 0.411120 | 0.045927 | 0 |
| 1089 | 0.018622 | 0.040311 | 0.005255 | 0.090391 | 0.103866 | 4 |
| 1090 | 0.003914 | 0.319463 | 0.028071 | 0.011313 | 0.260478 | 1 |
| 1091 | 0.000000 | 0.010453 | 0.008192 | 0.000165 | 0.049348 | 4 |
| 1092 | 0.005068 | 0.009027 | 0.018549 | 0.007709 | 0.191852 | 4 |
| 1093 | 0.000247 | 0.007568 | 0.000545 | 0.000585 | 0.017867 | 4 |
| 1094 | 0.191875 | 0.024200 | 0.005754 | 0.133884 | 0.100086 | 0 |
| 1095 | 0.000000 | 0.006734 | 0.000147 | 0.001391 | 0.026625 | 4 |
| 1096 | 0.000413 | 0.052943 | 0.004130 | 0.003416 | 0.081700 | 4 |
| 1097 | 0.002945 | 0.688471 | 0.005002 | 0.005702 | 0.153124 | 1 |
| 1098 | 0.001147 | 0.231610 | 0.020554 | 0.003280 | 0.212562 | 1 |
| ... | ... | ... | ... | ... | ... | ... |
| 1625 | 0.001559 | 0.255735 | 0.008991 | 0.002959 | 0.144560 | 1 |
| 1626 | 0.001372 | 0.140777 | 0.037677 | 0.004922 | 0.111969 | 1 |

(Rows 1099–1624 omitted for brevity. Each row of the full output lists a document's five normalized feature weights followed by its assigned cluster label, 0–4.)
| 1627 | 0.000341 | 0.013413 | 0.000873 | 0.003357 | 0.077037 | 4 |
| 1628 | 0.001158 | 0.025396 | 0.161890 | 0.001909 | 0.116938 | 2 |
| 1629 | 0.000000 | 0.001492 | 0.001705 | 0.000513 | 0.022451 | 4 |
| 1630 | 0.000180 | 0.011362 | 0.000889 | 0.000384 | 0.048084 | 4 |
| 1631 | 0.001150 | 0.283088 | 0.005372 | 0.000837 | 0.080839 | 1 |
| 1632 | 0.011037 | 0.024178 | 0.004826 | 0.054486 | 0.092613 | 4 |
| 1633 | 0.000000 | 0.015010 | 0.000299 | 0.000088 | 0.026525 | 4 |
| 1634 | 0.000000 | 0.014813 | 0.000621 | 0.000000 | 0.023241 | 4 |
| 1635 | 0.513959 | 0.011829 | 0.007117 | 0.167721 | 0.037393 | 0 |
| 1636 | 0.000697 | 0.019153 | 0.034409 | 0.002118 | 0.076131 | 4 |
| 1637 | 0.002457 | 0.413465 | 0.004332 | 0.006279 | 0.152687 | 1 |
| 1638 | 0.350319 | 0.000883 | 0.005412 | 0.188749 | 0.065587 | 0 |
| 1639 | 0.001040 | 0.118953 | 0.016796 | 0.003584 | 0.173019 | 4 |
| 1640 | 0.000254 | 0.025785 | 0.000530 | 0.000000 | 0.040255 | 4 |
| 1641 | 0.000488 | 0.031447 | 0.002996 | 0.001624 | 0.086126 | 4 |
| 1642 | 0.076316 | 0.025155 | 0.007533 | 0.353625 | 0.089492 | 3 |
| 1643 | 0.001352 | 0.042348 | 0.276399 | 0.006735 | 0.068761 | 2 |
| 1644 | 0.109737 | 0.030564 | 0.007173 | 0.503921 | 0.101928 | 3 |
| 1645 | 0.775193 | 0.000782 | 0.002470 | 0.111136 | 0.015560 | 0 |
| 1646 | 0.000707 | 0.010767 | 0.012502 | 0.002024 | 0.070278 | 4 |
| 1647 | 0.000120 | 0.017937 | 0.001730 | 0.001693 | 0.068124 | 4 |
| 1648 | 0.000000 | 0.018509 | 0.000647 | 0.001246 | 0.040523 | 4 |
| 1649 | 0.000080 | 0.031505 | 0.002879 | 0.000853 | 0.042957 | 4 |
| 1650 | 0.000058 | 0.015813 | 0.032941 | 0.000826 | 0.063416 | 4 |
| 1651 | 0.201790 | 0.000287 | 0.004010 | 0.143721 | 0.061232 | 0 |
| 1652 | 0.000093 | 0.009256 | 0.024816 | 0.002003 | 0.076575 | 4 |
| 1653 | 0.000000 | 0.005656 | 0.009517 | 0.000602 | 0.037998 | 4 |
| 1654 | 0.000925 | 0.091796 | 0.001796 | 0.001079 | 0.081138 | 1 |
| 1655 | 0.000000 | 0.042645 | 0.000961 | 0.001865 | 0.080728 | 4 |
| 1656 | 0.001828 | 0.022808 | 0.000481 | 0.000928 | 0.036673 | 4 |
| 1657 | 0.001928 | 0.008945 | 0.000406 | 0.001905 | 0.045830 | 4 |
| 1658 | 0.000638 | 0.017328 | 0.018681 | 0.001471 | 0.090882 | 4 |
| 1659 | 0.054300 | 0.069855 | 0.081418 | 0.245383 | 0.212830 | 3 |
| 1660 | 0.000000 | 0.008766 | 0.004165 | 0.000419 | 0.040790 | 4 |
| 1661 | 0.000000 | 0.013473 | 0.001848 | 0.000962 | 0.027367 | 4 |
| 1662 | 0.001670 | 0.013237 | 0.002325 | 0.005092 | 0.053169 | 4 |
| 1663 | 0.000000 | 0.004492 | 0.000523 | 0.000000 | 0.024600 | 4 |
| 1664 | 0.001037 | 0.033274 | 0.031469 | 0.001946 | 0.104495 | 4 |
| 1665 | 0.006139 | 0.039795 | 0.075872 | 0.026851 | 0.217007 | 4 |
| 1666 | 0.000000 | 0.004456 | 0.000087 | 0.000577 | 0.020050 | 4 |
| 1667 | 0.000155 | 0.020687 | 0.001548 | 0.000629 | 0.063501 | 4 |
| 1668 | 0.000385 | 0.034002 | 0.002254 | 0.000542 | 0.064152 | 4 |
| 1669 | 0.001569 | 0.007977 | 0.007456 | 0.003466 | 0.090351 | 4 |
| 1670 | 0.000000 | 0.004612 | 0.000557 | 0.000000 | 0.019791 | 4 |
| 1671 | 0.000242 | 0.029800 | 0.000731 | 0.001247 | 0.077900 | 4 |
| 1672 | 0.000947 | 0.044952 | 0.000922 | 0.001671 | 0.057996 | 4 |
| 1673 | 0.017805 | 0.221040 | 0.004348 | 0.006473 | 0.188688 | 1 |
| 1674 | 0.001723 | 0.008638 | 0.196036 | 0.002927 | 0.092266 | 2 |
| 1675 | 0.000000 | 0.008701 | 0.000000 | 0.002509 | 0.015127 | 4 |
| 1676 | 0.000731 | 0.035550 | 0.010703 | 0.003283 | 0.111115 | 4 |
| 1677 | 0.001062 | 0.064599 | 0.006932 | 0.003628 | 0.119050 | 4 |
| 1678 | 0.000000 | 0.007251 | 0.001958 | 0.001271 | 0.025785 | 4 |
| 1679 | 0.000000 | 0.012019 | 0.002875 | 0.001441 | 0.075732 | 4 |
| 1680 | 0.000000 | 0.013258 | 0.037287 | 0.000213 | 0.045468 | 4 |
| 1681 | 0.001637 | 0.223827 | 0.005412 | 0.004149 | 0.166318 | 1 |
| 1682 | 0.000563 | 0.009287 | 0.007273 | 0.001097 | 0.046659 | 4 |
| 1683 | 0.000000 | 0.007162 | 0.000205 | 0.000692 | 0.024727 | 4 |
| 1684 | 0.000555 | 0.026806 | 0.000991 | 0.003087 | 0.055969 | 4 |
| 1685 | 0.000063 | 0.013189 | 0.043115 | 0.001159 | 0.045346 | 4 |
| 1686 | 0.000000 | 0.005746 | 0.000000 | 0.000000 | 0.012378 | 4 |
| 1687 | 0.001247 | 0.031752 | 0.000966 | 0.001252 | 0.078080 | 4 |
| 1688 | 0.000293 | 0.011626 | 0.000956 | 0.000274 | 0.038347 | 4 |
| 1689 | 0.080499 | 0.004530 | 0.003422 | 0.282924 | 0.135037 | 3 |
| 1690 | 0.001882 | 0.181478 | 0.010184 | 0.001287 | 0.079966 | 1 |
| 1691 | 0.000177 | 0.011566 | 0.002396 | 0.000000 | 0.030624 | 4 |
| 1692 | 0.000879 | 0.024999 | 0.002626 | 0.000446 | 0.087334 | 4 |
| 1693 | 0.003262 | 0.216700 | 0.017868 | 0.006615 | 0.162596 | 1 |
| 1694 | 0.000785 | 0.023630 | 0.126291 | 0.001029 | 0.143010 | 4 |
| 1695 | 0.001027 | 0.159785 | 0.005849 | 0.002177 | 0.136256 | 1 |
| 1696 | 0.001236 | 0.165715 | 0.004625 | 0.003473 | 0.114483 | 1 |
| 1697 | 0.000000 | 0.003755 | 0.005969 | 0.001754 | 0.028341 | 4 |
| 1698 | 0.001410 | 0.358145 | 0.004059 | 0.004857 | 0.140468 | 1 |
| 1699 | 0.001306 | 0.060074 | 0.006802 | 0.003840 | 0.238609 | 4 |
| 1700 | 0.000000 | 0.015186 | 0.000160 | 0.001070 | 0.028080 | 4 |
| 1701 | 0.000000 | 0.010241 | 0.000000 | 0.000000 | 0.030831 | 4 |
| 1702 | 0.000281 | 0.014360 | 0.569282 | 0.000683 | 0.068222 | 2 |
| 1703 | 0.004224 | 0.058717 | 0.002272 | 0.002061 | 0.142934 | 4 |
| 1704 | 0.000000 | 0.023384 | 0.003680 | 0.001878 | 0.110952 | 4 |
| 1705 | 0.000444 | 0.011720 | 0.001446 | 0.000356 | 0.035684 | 4 |
| 1706 | 0.016614 | 0.039792 | 0.004940 | 0.058780 | 0.042676 | 3 |
| 1707 | 0.000000 | 0.001850 | 0.506936 | 0.000000 | 0.012820 | 2 |
| 1708 | 0.001001 | 0.343000 | 0.002992 | 0.000826 | 0.084516 | 1 |
| 1709 | 0.173745 | 0.011127 | 0.005858 | 0.378648 | 0.043239 | 3 |
| 1710 | 0.000292 | 0.006771 | 0.000438 | 0.000197 | 0.019345 | 4 |
| 1711 | 0.000691 | 0.070555 | 0.003583 | 0.002554 | 0.209521 | 4 |
| 1712 | 0.000000 | 0.012109 | 0.001488 | 0.000569 | 0.041727 | 4 |
| 1713 | 0.001292 | 0.097281 | 0.052524 | 0.003768 | 0.216515 | 4 |
| 1714 | 0.000617 | 0.024469 | 0.007631 | 0.003999 | 0.113872 | 4 |
| 1715 | 0.000405 | 0.086768 | 0.003680 | 0.002661 | 0.108063 | 4 |
| 1716 | 0.001744 | 0.457636 | 0.006101 | 0.004103 | 0.180226 | 1 |
| 1717 | 0.000000 | 0.020939 | 0.000410 | 0.000741 | 0.027360 | 4 |
| 1718 | 0.001462 | 0.275300 | 0.004036 | 0.003130 | 0.136945 | 1 |
| 1719 | 0.000931 | 0.101389 | 0.002697 | 0.001712 | 0.104341 | 4 |
| 1720 | 0.001212 | 0.021109 | 0.008004 | 0.003075 | 0.118204 | 4 |
| 1721 | 0.000024 | 0.007232 | 0.001659 | 0.000177 | 0.029293 | 4 |
| 1722 | 0.001210 | 0.046431 | 0.016881 | 0.003186 | 0.130091 | 4 |
| 1723 | 0.001816 | 0.154881 | 0.004357 | 0.001957 | 0.043884 | 1 |
| 1724 | 0.001074 | 0.038593 | 0.001569 | 0.001078 | 0.066265 | 4 |
| 1725 | 0.000231 | 0.011371 | 0.058541 | 0.003618 | 0.055148 | 2 |
| 1726 | 0.000964 | 0.074636 | 0.002501 | 0.002508 | 0.147593 | 4 |
| 1727 | 0.000361 | 0.015400 | 0.000941 | 0.000453 | 0.061191 | 4 |
| 1728 | 0.001090 | 0.079144 | 0.004918 | 0.003160 | 0.139107 | 4 |
| 1729 | 0.000202 | 0.078737 | 0.008330 | 0.005332 | 0.277641 | 4 |
| 1730 | 0.000290 | 0.018557 | 0.000825 | 0.001020 | 0.071801 | 4 |
| 1731 | 0.001286 | 0.121889 | 0.006782 | 0.003203 | 0.121165 | 1 |
| 1732 | 0.000000 | 0.011710 | 0.000775 | 0.000567 | 0.035036 | 4 |
| 1733 | 0.000190 | 0.040512 | 0.002630 | 0.001399 | 0.075286 | 4 |
| 1734 | 0.000178 | 0.021808 | 0.004761 | 0.001260 | 0.064142 | 4 |
| 1735 | 0.000000 | 0.003173 | 0.000269 | 0.000000 | 0.024196 | 4 |
| 1736 | 0.002347 | 0.256401 | 0.001728 | 0.002770 | 0.064518 | 1 |
| 1737 | 0.000000 | 0.016241 | 0.000637 | 0.000683 | 0.035232 | 4 |
| 1738 | 0.409271 | 0.013596 | 0.008497 | 0.615524 | 0.056041 | 3 |
| 1739 | 0.003012 | 0.628473 | 0.006225 | 0.005801 | 0.163519 | 1 |
| 1740 | 0.896723 | 0.003995 | 0.002266 | 0.110594 | 0.018018 | 0 |
| 1741 | 0.000453 | 0.006404 | 0.424966 | 0.002862 | 0.065920 | 2 |
| 1742 | 0.418914 | 0.007641 | 0.010943 | 0.612195 | 0.049061 | 3 |
| 1743 | 0.001081 | 0.024772 | 0.011026 | 0.003341 | 0.105893 | 4 |
| 1744 | 0.273327 | 0.013634 | 0.012613 | 0.964253 | 0.078249 | 3 |
| 1745 | 0.000000 | 0.047598 | 0.000843 | 0.000273 | 0.077391 | 4 |
| 1746 | 0.000535 | 0.024367 | 0.014535 | 0.004064 | 0.074310 | 4 |
| 1747 | 0.001774 | 0.126904 | 0.005791 | 0.004532 | 0.126336 | 1 |
| 1748 | 0.000503 | 0.023499 | 0.008143 | 0.002731 | 0.086140 | 4 |
| 1749 | 0.000000 | 0.001345 | 0.009167 | 0.001215 | 0.032834 | 4 |
| 1750 | 0.001485 | 0.287523 | 0.002667 | 0.003717 | 0.122095 | 1 |
| 1751 | 0.001397 | 0.056026 | 0.010139 | 0.002059 | 0.138059 | 4 |
| 1752 | 0.000730 | 0.240165 | 0.005268 | 0.000880 | 0.045323 | 1 |
| 1753 | 0.000841 | 0.073658 | 0.004351 | 0.003340 | 0.195698 | 4 |
| 1754 | 0.000743 | 0.130387 | 0.019405 | 0.003291 | 0.132023 | 4 |
| 1755 | 0.000000 | 0.004858 | 0.001540 | 0.000145 | 0.025620 | 4 |
| 1756 | 0.000930 | 0.070022 | 0.009038 | 0.005197 | 0.189889 | 4 |
| 1757 | 0.018482 | 0.457231 | 0.007107 | 0.011392 | 0.161888 | 1 |
| 1758 | 0.000202 | 0.013386 | 0.003023 | 0.004967 | 0.152956 | 4 |
| 1759 | 0.000229 | 0.042479 | 0.000475 | 0.000715 | 0.070855 | 4 |
| 1760 | 0.000172 | 0.019093 | 0.006403 | 0.001721 | 0.052622 | 4 |
| 1761 | 0.000902 | 0.034651 | 0.004595 | 0.003464 | 0.107479 | 4 |
| 1762 | 0.002525 | 0.009991 | 0.075308 | 0.004480 | 0.126378 | 4 |
| 1763 | 0.000000 | 0.029290 | 0.011767 | 0.000274 | 0.043604 | 4 |
| 1764 | 0.000363 | 0.009276 | 0.000722 | 0.000271 | 0.043928 | 4 |
| 1765 | 0.000248 | 0.010039 | 0.000171 | 0.001384 | 0.013051 | 4 |
| 1766 | 0.000655 | 0.074866 | 0.003214 | 0.003407 | 0.169174 | 4 |
| 1767 | 0.000985 | 0.027636 | 0.001212 | 0.001519 | 0.053517 | 4 |
| 1768 | 0.000110 | 0.022502 | 0.001745 | 0.001752 | 0.059122 | 4 |
| 1769 | 0.618283 | 0.001969 | 0.006037 | 0.198199 | 0.034133 | 0 |
| 1770 | 0.000000 | 0.008261 | 0.000318 | 0.001232 | 0.030118 | 4 |
| 1771 | 0.000000 | 0.022160 | 0.000638 | 0.000352 | 0.043392 | 4 |
| 1772 | 0.002827 | 0.739630 | 0.003963 | 0.005772 | 0.112424 | 1 |
| 1773 | 0.001230 | 0.338811 | 0.002829 | 0.002772 | 0.123594 | 1 |
| 1774 | 0.000189 | 0.042653 | 0.005350 | 0.001917 | 0.130622 | 4 |
| 1775 | 0.000678 | 0.021060 | 0.013600 | 0.000735 | 0.081429 | 4 |
| 1776 | 0.653020 | 0.007107 | 0.007789 | 0.365273 | 0.047076 | 0 |
| 1777 | 0.000238 | 0.017364 | 0.026278 | 0.001994 | 0.089992 | 4 |
| 1778 | 0.001051 | 0.023207 | 0.000284 | 0.000225 | 0.039212 | 4 |
| 1779 | 0.000000 | 0.002299 | 0.000340 | 0.001881 | 0.012838 | 4 |
| 1780 | 0.000000 | 0.005307 | 0.001819 | 0.000239 | 0.031300 | 4 |
| 1781 | 0.000526 | 0.007212 | 0.028877 | 0.001770 | 0.086547 | 4 |
| 1782 | 0.298598 | 0.006733 | 0.006033 | 0.394805 | 0.162844 | 3 |
| 1783 | 0.000000 | 0.010925 | 0.000744 | 0.001503 | 0.049192 | 4 |
| 1784 | 0.000000 | 0.013780 | 0.001054 | 0.000468 | 0.031654 | 4 |
| 1785 | 0.000655 | 0.013138 | 0.008851 | 0.000150 | 0.050531 | 4 |
| 1786 | 0.000000 | 0.010625 | 0.000123 | 0.000000 | 0.033382 | 4 |
| 1787 | 0.002380 | 0.011387 | 0.047403 | 0.004061 | 0.094804 | 4 |
| 1788 | 0.000061 | 0.009052 | 0.001969 | 0.000496 | 0.047948 | 4 |
| 1789 | 0.000000 | 0.006050 | 0.004138 | 0.000878 | 0.018902 | 4 |
| 1790 | 0.001254 | 0.039428 | 0.009585 | 0.001980 | 0.133977 | 4 |
| 1791 | 0.002229 | 0.239678 | 0.006844 | 0.004546 | 0.160750 | 1 |
| 1792 | 0.000692 | 0.133449 | 0.001701 | 0.002550 | 0.050513 | 1 |
| 1793 | 0.000000 | 0.005689 | 0.025976 | 0.001487 | 0.071423 | 4 |
| 1794 | 0.001980 | 0.128547 | 0.019577 | 0.005094 | 0.292629 | 4 |
| 1795 | 0.001458 | 0.241800 | 0.003591 | 0.004115 | 0.132239 | 1 |
| 1796 | 0.000000 | 0.005065 | 0.001375 | 0.000086 | 0.032049 | 4 |
| 1797 | 0.001027 | 0.017521 | 0.006366 | 0.002176 | 0.085745 | 4 |
| 1798 | 0.000000 | 0.002409 | 0.564862 | 0.000000 | 0.019616 | 2 |
| 1799 | 0.000000 | 0.008140 | 0.000136 | 0.000368 | 0.041560 | 4 |
| 1800 | 0.001538 | 0.041465 | 0.004886 | 0.001134 | 0.137249 | 4 |
| 1801 | 0.002220 | 0.018577 | 0.027595 | 0.002105 | 0.085671 | 4 |
| 1802 | 0.002915 | 0.517404 | 0.009795 | 0.007009 | 0.276764 | 1 |
| 1803 | 0.000284 | 0.008976 | 0.001167 | 0.000000 | 0.038558 | 4 |
| 1804 | 0.012650 | 0.000396 | 0.000000 | 0.024931 | 0.098998 | 4 |
| 1805 | 0.000000 | 0.027346 | 0.010546 | 0.000666 | 0.061760 | 4 |
| 1806 | 0.002571 | 0.418586 | 0.005517 | 0.004375 | 0.117765 | 1 |
| 1807 | 0.000302 | 0.028679 | 0.002975 | 0.003656 | 0.137186 | 4 |
| 1808 | 0.000859 | 0.011544 | 0.006668 | 0.001457 | 0.055822 | 4 |
| 1809 | 0.000854 | 0.058463 | 0.015534 | 0.002972 | 0.129730 | 4 |
| 1810 | 0.000000 | 0.043213 | 0.001078 | 0.001780 | 0.071917 | 4 |
| 1811 | 0.372594 | 0.016199 | 0.012150 | 0.932887 | 0.065358 | 3 |
| 1812 | 0.000873 | 0.042751 | 0.003632 | 0.000444 | 0.112342 | 4 |
| 1813 | 0.000140 | 0.015931 | 0.000685 | 0.001345 | 0.042391 | 4 |
| 1814 | 0.000000 | 0.011785 | 0.001035 | 0.000720 | 0.036189 | 4 |
| 1815 | 0.000183 | 0.003929 | 0.001384 | 0.000603 | 0.011563 | 4 |
| 1816 | 0.000000 | 0.005571 | 0.004653 | 0.000282 | 0.026415 | 4 |
| 1817 | 0.001204 | 0.015632 | 0.047524 | 0.003839 | 0.140000 | 4 |
| 1818 | 0.000157 | 0.021465 | 0.000282 | 0.000509 | 0.032233 | 4 |
| 1819 | 0.004008 | 0.339172 | 0.006875 | 0.004121 | 0.140828 | 1 |
| 1820 | 0.000000 | 0.006465 | 0.005415 | 0.000323 | 0.031259 | 4 |
| 1821 | 0.000794 | 0.020215 | 0.000494 | 0.000807 | 0.038817 | 4 |
| 1822 | 0.000000 | 0.008325 | 0.001695 | 0.000136 | 0.031014 | 4 |
| 1823 | 0.000000 | 0.016105 | 0.000784 | 0.000563 | 0.031621 | 4 |
| 1824 | 0.000725 | 0.089606 | 0.004024 | 0.003715 | 0.127792 | 4 |
| 1825 | 0.000550 | 0.189584 | 0.009689 | 0.006143 | 0.326373 | 4 |
| 1826 | 0.000251 | 0.027911 | 0.001461 | 0.001256 | 0.096978 | 4 |
| 1827 | 0.000575 | 0.016946 | 0.001346 | 0.001781 | 0.039065 | 4 |
| 1828 | 0.000000 | 0.004105 | 0.042110 | 0.000128 | 0.026197 | 2 |
| 1829 | 0.000000 | 0.007864 | 0.000174 | 0.000850 | 0.020302 | 4 |
| 1830 | 0.001308 | 0.295011 | 0.003454 | 0.002363 | 0.129576 | 1 |
| 1831 | 0.832113 | 0.003016 | 0.001660 | 0.088947 | 0.014335 | 0 |
| 1832 | 0.001324 | 0.014179 | 0.002920 | 0.003135 | 0.080980 | 4 |
| 1833 | 0.000573 | 0.026217 | 0.003264 | 0.001277 | 0.069132 | 4 |
| 1834 | 0.000976 | 0.145485 | 0.009812 | 0.002960 | 0.078804 | 1 |
| 1835 | 0.002350 | 0.027492 | 0.003976 | 0.000932 | 0.074353 | 4 |
| 1836 | 0.000179 | 0.023479 | 0.003291 | 0.004517 | 0.135118 | 4 |
| 1837 | 0.000156 | 0.016875 | 0.009978 | 0.000761 | 0.048904 | 4 |
| 1838 | 0.000115 | 0.014252 | 0.003858 | 0.001378 | 0.068396 | 4 |
| 1839 | 0.000577 | 0.005096 | 0.000782 | 0.002392 | 0.026489 | 4 |
| 1840 | 0.003843 | 0.731886 | 0.004151 | 0.007431 | 0.099123 | 1 |
| 1841 | 0.000269 | 0.046232 | 0.003821 | 0.003662 | 0.142229 | 4 |
| 1842 | 0.000000 | 0.003780 | 0.015388 | 0.000829 | 0.039772 | 4 |
| 1843 | 0.000436 | 0.006970 | 0.282489 | 0.002708 | 0.072202 | 2 |
| 1844 | 0.000280 | 0.007601 | 0.000530 | 0.001324 | 0.030094 | 4 |
| 1845 | 0.002363 | 0.056881 | 0.002612 | 0.003812 | 0.121852 | 4 |
| 1846 | 0.000312 | 0.015561 | 0.003159 | 0.001756 | 0.042368 | 4 |
| 1847 | 0.000000 | 0.013707 | 0.001357 | 0.001605 | 0.064255 | 4 |
| 1848 | 0.003112 | 0.799316 | 0.005292 | 0.004669 | 0.108407 | 1 |
| 1849 | 0.002081 | 0.036567 | 0.074419 | 0.004268 | 0.109792 | 4 |
| 1850 | 0.000000 | 0.008555 | 0.000351 | 0.000218 | 0.025054 | 4 |
| 1851 | 0.000000 | 0.013233 | 0.053029 | 0.001915 | 0.080267 | 4 |
| 1852 | 0.000212 | 0.005736 | 0.325473 | 0.000000 | 0.031981 | 2 |
| 1853 | 0.000584 | 0.038716 | 0.025864 | 0.002145 | 0.145968 | 4 |
| 1854 | 0.000119 | 0.003479 | 0.000169 | 0.000041 | 0.022914 | 4 |
| 1855 | 0.002055 | 0.085641 | 0.001225 | 0.001940 | 0.105661 | 4 |
| 1856 | 0.000000 | 0.003561 | 0.005058 | 0.000496 | 0.023505 | 4 |
| 1857 | 0.000215 | 0.024078 | 0.026162 | 0.001679 | 0.093335 | 4 |
| 1858 | 0.000461 | 0.043597 | 0.005332 | 0.002511 | 0.182829 | 4 |
| 1859 | 0.815943 | 0.005315 | 0.002253 | 0.096220 | 0.016251 | 0 |
| 1860 | 0.000204 | 0.008213 | 0.000411 | 0.000886 | 0.031156 | 4 |
| 1861 | 0.000484 | 0.039386 | 0.002322 | 0.000744 | 0.102555 | 4 |
| 1862 | 0.001384 | 0.331846 | 0.003938 | 0.002497 | 0.143250 | 1 |
| 1863 | 0.000000 | 0.016161 | 0.001455 | 0.000000 | 0.044579 | 4 |
| 1864 | 0.001069 | 0.019280 | 0.067558 | 0.003624 | 0.157591 | 4 |
| 1865 | 0.000292 | 0.016024 | 0.173275 | 0.001111 | 0.057925 | 2 |